GreenDart Inc. Team Blog

Stay updated with the latest trends at GreenDart Inc. Please subscribe!

Subcategories from this category:

Cybersecurity

Remote Testing

OVERVIEW

Given the current COVID-19 environment and the pressure it puts on the mobility of test teams completing Operational Test and Evaluation requirements on key programs, much more emphasis is now placed on integrating remote testing into test event planning, execution, and reporting. Three efforts in particular are augmented by a more aggressive remote testing requirement: pre-event information capture from non-traditional operational test sources; operational observation; and event execution data collection with coordination of early in-process event assessments.

PRE-EVENT DATA/INFORMATION/CAPABILITY CAPTURE

Techniques: Developmental test results analysis; customer, contractor, user community coordination for suitable data sources.

Tools: Developer/community test environments.



Developmental Test and Evaluation

A key component of an effective T&E effort is Developmental Test and Evaluation (DT&E). Conducted by the system development organization, DT&E is performed throughout the acquisition and sustainment processes to verify that critical technical parameters have been achieved. It supports the development and demonstration of new materiel or operational capabilities as early as possible in the acquisition life cycle. After the Full Rate Production (FRP) decision or fielding approval, DT&E supports the sustainment of systems to keep them current and extend their useful life, performance envelopes, and/or capabilities. Developmental testing must lead to and support a certification that the system is ready for dedicated operational testing.

DT&E efforts include:

  • Assess the technological capabilities of systems or concepts in support of requirements activities;
  • Evaluate and apply Modeling and Simulation (M&S) tools and digital system models;
  • Identify and help resolve deficiencies as early as possible;
  • Verify compliance with specifications, standards, and contracts;
  • Characterize system performance and military utility, and verify system safety;
  • Quantify contract technical performance and manufacturing quality;
  • Ensure fielded systems continue to perform as required in the face of changing operational requirements and threats;
  • Ensure all new developments, modifications, and upgrades address operational safety, suitability, and effectiveness;
  • During sustainment upgrades, support aging and surveillance programs, value engineering projects, productivity, reliability, availability and maintainability projects, technology insertions, and other modifications.

DT&E is typically conducted to verify and validate developer requirements, for example those specified in documents such as the Software Requirements Specification (SRS), which are derived from customer top-level specifications. The reports and other products of this (i.e., component) level of verification may also serve as required inputs to the system acceptance process.
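As a simple illustration of this component-level verification bookkeeping, a trace check against SRS-level requirements might look like the following sketch. All requirement IDs, test IDs, and results here are invented for illustration, not drawn from any real program.

```python
# Hypothetical trace check: does every SRS requirement map to at least
# one passing test case? IDs and results below are invented examples.
srs_requirements = {"SRS-001", "SRS-002", "SRS-003"}

# Each test case lists the requirement(s) it verifies and its result.
test_results = [
    {"test": "TC-10", "verifies": {"SRS-001"}, "passed": True},
    {"test": "TC-11", "verifies": {"SRS-002", "SRS-003"}, "passed": True},
]

verified = set()
for tc in test_results:
    if tc["passed"]:
        verified |= tc["verifies"]

untested = srs_requirements - verified
print("Requirements lacking a passing verification:", sorted(untested))
```

A report generated from such a trace is exactly the kind of component-level product that can feed the system acceptance process.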

How GreenDart can help you: We are proven experts in designing and executing T&E programs for all hardware and software developmental program efforts. Please contact us.

Please provide any comments you might have on this post.




Operational T&E

The key step of an effective T&E effort, which drives product deployment, is the conduct of Operational Test and Evaluation (OT&E). OT&E is a formal test and analysis activity, performed in addition to and largely independent of the DT&E conducted by the development organization. OT&E brings a sharp focus on the probable success of a development article (software, hardware, complex systems) in terms of performing its intended mission once it is fielded. Probable success is evaluated primarily in terms of the “operational effectiveness” and “operational suitability” of the system in question. Operational effectiveness is a quantification of the contribution of the system to mission accomplishment under the intended and actual conditions of employment. Operational suitability is a quantification of system reliability and maintainability, the effort and level of training required to maintain, support and operate the system, and any unique logistics requirements of the system.

 

While DT&E comprehensively tests to the formal program requirements, OT&E concentrates on assessing the Critical Operational Issues (COI) identified for each program. Measures of Effectiveness (MOEs) are defined (ideally, early in the program life cycle) to support quantitative assessment of the COI. An MOE may reflect test results for one or several key requirements, while some (secondary) requirements may not map into any MOE. Similarly, the OT&E team uses Measures of Suitability (MOSs) to quantify development product performance against the “ilities” relevant to the particular development product. While the DT&E team may give little attention to evaluation of MOEs and MOSs, the OT&E team uses these technical measures extensively to focus their test planning and as a standardized and compact vehicle for communicating their findings to responsible decision authorities.
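To make the MOE/MOS idea concrete, here is a minimal sketch of how such measures might be quantified. The mission outcomes, reliability figures, and the availability formula chosen here are illustrative assumptions, not measures from any actual program.

```python
# Illustrative only: simple quantifications of one MOE and one MOS.
# All data and thresholds are hypothetical.

# MOE: fraction of test missions in which the mission task succeeded.
mission_outcomes = [True, True, False, True, True, True, True, False, True, True]
moe_success_rate = sum(mission_outcomes) / len(mission_outcomes)

# MOS: operational availability computed from mean time between
# failures (hours) and mean time to repair (hours).
mtbf, mttr = 400.0, 8.0
mos_availability = mtbf / (mtbf + mttr)

print(f"MOE (mission success rate): {moe_success_rate:.2f}")
print(f"MOS (availability): {mos_availability:.3f}")
```

Reporting findings as a handful of numbers like these is what makes MOEs and MOSs a compact vehicle for communicating with decision authorities.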

An OT&E campaign conventionally has two phases: an initial study and preparation phase, followed by a highly structured formal testing phase. Significant OT&E planning, assessment and preparation efforts occur throughout the first phase, which occurs in parallel to the development effort. Independent reports and recommendations are also made to the acquisition authority during this phase. The second phase, in DOD parlance known as Initial OT&E (IOT&E), follows final developer delivery of the target product, but precedes operational deployment. IOT&E is a series of scripted tests conducted on operational hardware, using developer-qualified deliveries, and under test conditions as representative as practical of the expected operational environment. If testing results meet pre-established criteria, IOT&E culminates with a recommendation to certify the development article for operational deployment. Once past the Full Rate Production milestone, Follow-on OT&E (FOT&E) of the development article may occur to verify the operational effectiveness and suitability of the production system, determine whether deficiencies identified during IOT&E have been corrected, and evaluate areas not tested during IOT&E due to system limitations. Additional FOT&E may be conducted over the life of the system to refine doctrine, tactics, techniques, and training programs and to evaluate future increments, modifications, and upgrades.
OT&E independence from the development program (including the program manager and immediate program sponsors) is a key attribute distinguishing it from DT&E. However, use of a common Test and Evaluation Master Plan (TEMP) is typical, and well-controlled integrated DT/OT testing (integrated testing) is encouraged.

How GreenDart can help you: We are proven experts in designing and executing operational T&E programs for all hardware and software developmental program efforts. Please contact us.




Agile T&E

Agile system development has emerged as an alternative to the long-standing “waterfall” software development process. Agile development involves the rapid development of incremental system capabilities through short iterations called “sprints”. At the start, the program selects requirements from the overall system requirements specification, builds user stories around those requirements, and allocates those user stories to specific sprint events. During each sprint, the developers go through a mini-waterfall effort of requirements, design, code, and test for a very small segment of the overall system. Once the sprint is complete, the resultant product is typically integrated into the evolving overall system. Any unfulfilled sprint requirements go into a requirements holding ledger called the Product Backlog (PBK) for reassignment to future sprints. Sprint re-planning occurs, as needed, based on the accomplishments of previous sprints.

 

Testing within each sprint roughly resembles a very short waterfall testing effort, as described in our T&E description earlier. However, for this discussion, we focus on the unique Agile T&E activities that occur outside of each sprint. These activities include requirements verification, trace, and test results assessments.

T&E validates the user stories and associated critical technical parameters against top level requirements, identifying any issues early in the sprint development cycles. As sprints are executed and various levels of story “completion” are achieved, the PBK is updated to re-capture and re-plan those requirements that were not completed during the sprint or that underwent vital user-driven updates. Perturbations to delivered incremental capabilities and changes to go-forward strategies are quite common. These changes are captured by a dynamic T&E planning effort. Finally, because the programs are typically on a rapid 2-week sprint cadence, the T&E engagement and reporting cycles are quite short. This creates additional T&E/developer coordination opportunities, which improves T&E planning timeliness.

At the end of each sprint, the T&E team assesses the achieved sprint requirements, now integrated into the evolving target system. With each sprint integration, tests are performed against those specific requirements, along with regression tests of existing capabilities. Successful T&E results drive capability acceptance, while T&E failures drive PBK updates and future sprint re-planning efforts.
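The sprint acceptance and PBK update cycle described above can be sketched in a few lines. The story IDs and T&E results below are invented for illustration; real programs would of course track far richer state.

```python
# Minimal sketch of sprint acceptance and Product Backlog (PBK) updates.
# Story IDs and T&E results are hypothetical.
product_backlog = ["US-04", "US-05"]           # stories awaiting a sprint
sprint_results = {"US-01": "passed",           # T&E result per sprint story
                  "US-02": "failed",
                  "US-03": "passed"}

# Passing stories are accepted into the evolving system.
accepted = [s for s, result in sprint_results.items() if result == "passed"]

# Failed stories return to the backlog for future sprint re-planning.
product_backlog += [s for s, result in sprint_results.items() if result == "failed"]

print("Accepted this sprint:", accepted)
print("Updated backlog:", product_backlog)
```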


Design of Experiments

The key test optimization opportunity of an effective T&E effort is the application of Design of Experiments (DOE). DOE is a systematic method for determining the relation between the inputs or factors affecting a process or system and the output of that system. The system under test may be a process, a machine, a natural biological system, or many other dynamic entities. This discussion concerns the use of DOE for testing a software-intensive system (a standalone program, or integrated hardware and software).


Test planners have a range of software test strategies and techniques to choose from in developing a detailed test plan. The choices made will depend on the integration level (i.e., unit test to system-of-systems T&E) of the target test article, as well as the specified and generated test requirements. Typically, the complete test plan will involve a combination of these techniques. Most of them are commonly known, but applying DOE for testing software-intensive systems may not be as familiar. In this context, the design in “DOE” is a devised collection of test cases (experimental runs) selected to efficiently answer one or more questions about the system under test. This test case collection may comprise a complete software test plan, or a component of that plan.
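One familiar way to devise such a collection is pairwise (2-way) covering: instead of running every combination of factor levels, select a smaller set of runs that still exercises every pair of levels. The factors, levels, and the simple greedy heuristic below are our own invented sketch, not a prescribed DOE method.

```python
# Hedged sketch: shrinking a full-factorial test matrix to a pairwise
# covering design with a greedy heuristic. Factors/levels are invented.
from itertools import combinations, product

factors = {
    "os":      ["linux", "windows"],
    "browser": ["chrome", "firefox", "safari"],
    "network": ["lan", "wifi"],
}
names = list(factors)
all_runs = list(product(*factors.values()))   # full factorial: 2*3*2 runs

def pairs_of(run):
    """All (factor, level) pairs exercised by one run."""
    return {((names[i], run[i]), (names[j], run[j]))
            for i, j in combinations(range(len(names)), 2)}

needed = set().union(*(pairs_of(r) for r in all_runs))

# Greedy: repeatedly pick the run covering the most uncovered pairs.
design, covered = [], set()
while covered != needed:
    best = max(all_runs, key=lambda r: len(pairs_of(r) - covered))
    design.append(best)
    covered |= pairs_of(best)

print(f"Full factorial: {len(all_runs)} runs; pairwise design: {len(design)} runs")
```

Even on this toy problem the devised design halves the run count while still answering every pairwise-interaction question; on systems with many factors the savings grow dramatically.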

DOE can save significant test time within the overall DT&E and/or OT&E efforts. In one instance, GreenDart achieved 66% DT&E schedule savings through the successful application of DOE.

 



Cybersecurity Resilience


 

Cybersecurity, cyber resilience, operational resilience: once we think we have grasped the inputs, outputs, expectations, and requirements behind one term, industry shifts and new terminology arises. The conversation is one of nuance, encumbered by differences in terminology and boundaries. These terms are fairly new and easily misused or misunderstood. For all intents and purposes within the IT space, Cyber Resilience is our term of choice. Cyber Resilience refers to an entity’s ability to withstand and recover from a cyber event, and it is measurable with regard to the operational evaluation of an entity or system.

The key question Cyber Resilience addresses is:

How protected and resilient are the internal system attributes (applications, data, controls, etc.) assuming the threat has already penetrated the external cybersecurity protections?



Data Analytics

Data Analytics is the science of analyzing raw data in order to draw insightful conclusions. A vast array of techniques and processes once performed manually by data analysts have been automated into algorithms that digest and distill raw data into meaningful responses. This kind of data analysis is a critical foundation in a world dominated by the availability and manipulation of data. Each day we use our smartphones, smart TVs, smart cars, and smart home devices in the Internet of Things, and in doing so we provide these devices with an immense amount of data about ourselves and our market behaviors. We create patterns and natural rule sets that these systems process and ultimately, iteratively, learn from. The world of data analytics is an ever-evolving space, still awaiting the perfect answers to questions not yet asked.

 

APPROACH

GreenDart operates at the forefront of industry through deep involvement in the world of data analytics. GreenDart is instrumental in advancing a law-enforcement-led program to create a single, unified law enforcement data set, giving analysts a single platform on which to develop and train more effective algorithms. This platform ingests multiple sources and allows analysts to create patterns and rule sets to identify outliers for further inspection. Once initial ingestion is complete, the computer learns new patterns based on these initial algorithms, optimally resulting in predictive identification and rule tailoring. GreenDart is essential in addressing the gaps between requirements and the solution expected by the customer. Great emphasis is put on the ability of the computer to learn new patterns with minimal input from the analyst.
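As a toy example of the rule-based outlier flagging idea described above: the data below and the two-sigma rule are illustrative assumptions on our part, not the actual platform's logic.

```python
# Hypothetical rule: flag records whose value deviates strongly from the
# mean for further analyst inspection. All data values are invented.
from statistics import mean, stdev

case_durations_days = [12, 14, 11, 13, 15, 12, 95, 14, 13, 12]

mu, sigma = mean(case_durations_days), stdev(case_durations_days)

# Simple two-sigma rule set: anything far from the mean is an outlier.
outliers = [x for x in case_durations_days if abs(x - mu) > 2 * sigma]

print("Flagged for further inspection:", outliers)
```

In a learning system, rules like this form only the initial pass; subsequent passes would tune thresholds and derive new patterns from analyst feedback.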

 



Test Program Verification

For very high value or otherwise critical development efforts, a customer may elect to procure an independent assessment of the system developer’s T&E program. This effort is typically known as Test Program Verification or simply Test Verification. The Test Verification agent may be brought in by the developer (but kept distinct from the developer’s T&E organization), or the agent may be directly hired by the Government customer, to achieve a higher degree of independence.

Since the intent is to drive a rigorous developmental T&E effort, many of the same activities and issues discussed in our DT&E write-up are relevant here, although the perspective is different (e.g., the Test Verification agent does not plan or execute tests). Although the Test Verification agent and the OT&E agent may both be members of an Integrated Test Team, their mutual interaction is apt to be limited.

Test Verification involves many of the Test Verification steps defined in GreenDart’s Verification and Validation – Test description. However, the target for this effort is to review and assess the developer’s Requirements Verification Plan (RVP) and their Requirements Verification Report (RVR). Successful assessment of these developer products is critical to achieve customer confidence in the developer’s test program and, therefore, confidence in the successful delivery of the desired system. The figure below shows a notional Test Verification process flow.

 

RVP/RVR Assessment Process Flow


Test and Evaluation

 

The objective of the Test and Evaluation function is to systematically assess the products of a development effort against the stated requirements, provide decision authorities with objective information characterizing the progress of the development effort, help manage risks, identify and subsequently confirm resolution of deficiencies, and ensure systems are operationally mission capable. T&E activities demonstrate that the system meets established operational objectives, capability-based requirements, and mandated exit criteria before moving to the next development phase.

To accomplish this, the T&E agents perform the following:

  • Collaborate with requirements sponsors and system developers to gain insight into program requirements that drive T&E efforts;
  • Provide timely, accurate, and affordable information to decision makers to support production and fielding decisions;
  • Perform testing to ensure that all requirements are satisfied;
  • Perform independent test results evaluations where critical, to confirm compliance;
  • Develop detailed assessments of the system with respect to operational effectiveness, survivability, operational suitability, support logistics, operator training, and system security;
  • Provide information to operators to allow them to assess mission impacts, develop doctrine, improve requirements, develop logistics and training plans, and refine Tactics, Techniques, and Procedures (TTP).

The following T&E principles are based on Department of Defense (DOD) 5000-series documents and GreenDart lessons learned. The unifying theme is that the T&E team must collaborate across the customer, developer, test, and user communities to achieve the most effective T&E results.

  • Program Specific Tailoring. All T&E strategies and procedures are adaptable to fit the needs of acquisition programs consistent with sound systems engineering practices, common sense, statutory and regulatory guidelines, and the time-sensitive nature of operators’ requirements. T&E plans are formed by tailoring standard strategies and procedures to address specific program requirements, with an expectation that in-progress adjustments may be needed.
  • Early Tester Involvement. The early incorporation of T&E expertise and operational insight, preferably before the concept refinement phase, is a key to successful initiation of new programs. In some cases, T&E may enter at a more mature phase; in these instances, substantial program background must be absorbed for the T&E team to be effective.
  • Early Deficiency Resolution. Deficiencies must be identified and resolved as early as possible to mitigate development costs impacts.
  • Schedules and Exit Criteria Observed. T&E efforts are allocated against developer schedules and operational deployment timelines to achieve timely system performance verification. Satisfaction of development milestone exit criteria, whether statutory, policy-driven, or customer-specified, is explicitly addressed in T&E planning.

How GreenDart can help you: We are proven experts in designing and executing T&E programs for all hardware and software developmental program efforts. Please contact us.
