2019 ITEA Journal Issue Abstracts




June 2019 – Accelerating Test and Evaluation with LVC and Agile

The theme for this issue is “Accelerating Test and Evaluation with LVC and Agile,” and the issue includes the President’s Corner, two Guest Editorials, an Inside the Beltway, a feature discussing the upcoming 2019 International Test and Evaluation Association (ITEA) Symposium, and seven technical articles. The term “LVC” has many meanings depending on who uses it, the purpose of the discussion, and who is trying to understand it. For this issue, we want to promote a discussion of the ways a mix of Live (humans operating actual systems), Virtual (humans operating in simulators), and Constructive (humans controlling computer simulations) elements can be used, for example, to facilitate system trade-space studies, evaluate proposed systems, or compare current and proposed systems in simulated combat environments.

President’s Corner, written by William Keegan, ITEA President, covers the state of the association, ITEA accomplishments, and upcoming ITEA events.

Our first Guest Editorial is “Sometimes a Picture Is Worth a Thousand Words” written by Steve Hutchison, Ph.D., as a reminder of the critical importance of test and evaluation (T&E). Thorough testing helps ensure that systems work when operators expect, want, and need them to.

Our second Guest Editorial is “Cyber Testing in a Rapid DoD Acquisition Environment” from Kevin McGowan. He states that the desire to use Agile methods and rapid prototyping acquisition strategies presents some challenges to the cyber test community. He describes several ways to fully assess and incorporate cyber survivability, even as acquisition and T&E are accelerated, including continuous two-way communication and cooperation among developers, testers, and customers.

For our Inside the Beltway feature, “Going Faster: Implications for Test and Evaluation,” Eileen Bjorkman, Ph.D., explains that regardless of how we label it, DoD acquisition programs are trending toward more rapid fielding and accepting more risk. She also explains that the systems and techniques used to test new systems must be in place before the new systems can be accelerated through T&E.

Our next feature, “Let’s Meet in Kauai One More Time,” is a description of the plans for the 2019 Symposium. We will be holding this Symposium in Kauai for a third time. With 11 military bases in Hawaii, including the largest instrumented multi-environment training and test range in the world on Kauai, this will be an ideal location to hear feedback from warfighters and presentations by acquisition and T&E professionals. This article describes the Symposium’s Technical Program and outlines the basic sequence of events. More information will be provided on the ITEA website, in regular electronic mail announcements, and in a final article in the September issue.

Our first of seven technical articles, “Lessons from Past Rapid Acquisition Programs,” written by Lindsey Davis, Ph.D., summarizes the lessons learned from previous rapid acquisition programs to help establish recommended best practices for future programs. She provides case studies to illustrate some of the lessons learned and some best practices that may improve the chances of successful rapid fielding.

The second technical article in this issue, “Test in the Age of Agile: Rising to the Challenge of Agile Software Development,” written by Colonel Douglas Wickert, Ph.D., states that adapting Agile methods for military systems requires careful consideration and changes to traditional T&E methods. He presents the unique challenges of testing within Agile processes and continuous development frameworks, and he argues that we must ensure speed is vectored in a direction that serves the needs of warfighters.

For our next article, “Self Service Infrastructure Environment for Next Generation High Performance Test and Evaluation (T&E),” by Chuck Reynolds and Steve Campbell, the authors explain a growing trend of high-performance computing centers using self-service portals to schedule, reserve, configure, and deliver solutions. The authors describe their proposed approach to providing T&E as a self-service high-performance computing solution.

In our fourth technical article, “A New Strategy for Funding Test and Evaluation Range Infrastructure,” Jeffrey Riemer, Ph.D., describes an alternative funding strategy that can prevent delays in range availability and reduce the risk of not having the necessary T&E infrastructure when needed. The author’s plan leverages the concept of recoupment, and he explains how this can be a win-win for program offices and the T&E ranges.

In the fifth technical article, “Environmental Challenges and Range Sustainability,” Paul Knight addresses the challenges faced by Department of Defense (DoD) test ranges, and he explains the best practices ranges use to mitigate their impact. With proactive engagement and a long-term commitment to staying ahead of these challenges, ranges can be sustained and made ready for accelerated T&E.

For the sixth article, R. Douglas Flournoy, et al., present “StreamServer for Fast Data Analytics” to highlight a class of efficient processing methods for analyzing the contents of high-velocity, high-volume data streams in real time. In their design, the authors achieved up to 25 times better streaming throughput, and they hint at new research that may allow another leap in throughput.

For the seventh and last article in this issue, “Can Agile, Systems Engineering, and Independent T&E Coexist and Cooperate?,” Dave Brown, Ph.D., and Dave Bell, Ph.D., state that systems engineering provides the top-level structure and process to integrate the Agile process into large-scale developments. They explain that Agile teams generally use a DevSecOps continuous process, with significant automation, to deliver incrementally improved software. The answer to the question in the title is “yes,” with the conclusion that the combination of Agile, systems engineering, and independent T&E shows great promise to enhance the development of complex systems.

I hope you enjoy this second issue of 2019 for The ITEA Journal of Test and Evaluation. By the time you receive issue 40-2 in June, the September 2019 issue 40-3 will be in final preparation. The theme for September will be “Aligning Modernization of DoD Test Ranges with National Defense Strategy.” For the next issue (the last issue of 2019), 40-4, the deadline for submissions is just after September 1, 2019, and the theme will be “Drowning in Data: How to Gain Timely Information and Knowledge from Data.” We have posted all themes and descriptions for 2020 and 2021 on the ITEA website, and we will post more themes later in 2019. Please provide feedback on the choice of themes, and please write early and often.


March 2019 – Statistical Methods in T&E

The ITEA Journal of Test and Evaluation - March 2019, Vol. 40, No. 1

The theme for this issue is “Statistical Methods in T&E,” and the issue includes the President’s Corner, a Guest Editorial, a feature discussing the upcoming 2019 ITEA Symposium, a feature for “Capturing the Knowledge Base of Test and Evaluation” that will be part of the ITEA Test and Evaluation Handbook, and six technical articles.

President’s Corner, written by William Keegan, President of the Board of Directors for ITEA, covers the state of the association, ITEA accomplishments, and upcoming ITEA events.

Our Guest Editorial is an interview of Greg Zacharias, Ph.D., the Chief Scientist for the Director of Operational Test and Evaluation (DOT&E), conducted by Laura Freeman, Ph.D., titled “A Conversation with the DOT&E Chief Scientist.” In this article, Dr. Zacharias discusses his career in the Air Force, academic background, DOT&E initiatives, accelerating acquisition, use of modeling and simulation, human-systems integration, and autonomous systems testing.

The next feature is an article planned to be part of the ITEA Test and Evaluation (T&E) Handbook. For this issue, Laura Freeman, Ph.D., et al., in “Designing Experiments for Model Validation – The Foundations for Uncertainty Quantification,” state that the validity of information provided by models depends on the rigor of the verification and validation processes used. The authors propose Design of Experiments and statistical analyses as methods to support statistical comparison of simulation results to live data. (Note: Readers can suggest edits/additions to this article or to the draft outline of the handbook at CTEP-handbook@itea.org. The proposed outline appears in “Capturing the Knowledge Base of Test and Evaluation,” Issue 39(4), December 2018, of The ITEA Journal of Test and Evaluation, pages 215-216. The topic of this second handbook article is listed under the subheading “Data Collection and Analysis” and the topic “Statistics principles and tools.”)
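To make the live-versus-simulation comparison concrete, here is a minimal, hypothetical sketch (the data, sample sizes, and choice of test below are illustrative assumptions, not drawn from the article): a two-sample test checks whether simulated responses differ detectably from live responses collected at matched design points.

```python
# Illustrative sketch only; not the authors' method. Hypothetical live and
# simulated responses (e.g., miss distance in meters) at matched design
# points, compared with Welch's two-sample t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
live = rng.normal(loc=10.0, scale=2.0, size=12)   # live test responses
sim = rng.normal(loc=10.8, scale=1.5, size=12)    # model-predicted responses

t_stat, p_value = stats.ttest_ind(live, sim, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value flags a statistically detectable live-vs-simulation gap;
# estimating the size of that gap is one input to uncertainty quantification.
```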

Our next feature is a description of the venue for the 2019 Symposium. We will be holding this Symposium in the Pacific Command’s warfighter zone. I have always enjoyed ITEA events where DoD acquisition and industry leaders provide recommendations to the T&E community; in this case, it will be the warfighters providing some measure of feedback. This article discusses the venue in terms of military testing and experimentation and the significance of the DoD work being done in the region.

Subsequent articles in June and September will cover the event and the program development. Any tours we can arrange will also be covered in those issues.

Our first of six technical articles, “The Effect of Extremes in Small Sample Size on Simple Mixed Models: A Comparison of Level-1 and Level-2 Size,” written by Kristina Carter, et al., summarizes a simulation study examining the impact of small samples at the observation and nesting levels of the model. The authors found that the rules of thumb for sample sizes can sometimes be relaxed while still attaining sufficient power, depending on the type of factor, the acceptability of a higher risk of a type I error, or when the minimum effect worth detecting is large.
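As a rough illustration of the kind of simulation study described (a hypothetical sketch, not the authors' code: the group count, group size, variances, and effect size below are all assumptions), one can estimate power for a small two-level design by Monte Carlo:

```python
# Hypothetical power simulation for a simple mixed model with small
# level-2 (groups) and level-1 (observations per group) sample sizes.
import warnings
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

warnings.filterwarnings("ignore")   # suppress convergence chatter in the loop
rng = np.random.default_rng(7)
n_groups, n_per_group, effect = 6, 5, 1.0

def one_trial() -> bool:
    g = np.repeat(np.arange(n_groups), n_per_group)
    treat = np.tile([0, 1], n_groups * n_per_group // 2)
    y = (effect * treat
         + rng.normal(0, 1.0, n_groups)[g]   # level-2 (group) random effects
         + rng.normal(0, 1.0, len(g)))       # level-1 residual noise
    df = pd.DataFrame({"y": y, "treat": treat, "g": g})
    fit = smf.mixedlm("y ~ treat", df, groups=df["g"]).fit()
    return fit.pvalues["treat"] < 0.05

power = np.mean([one_trial() for _ in range(200)])
print(f"Estimated power at these sizes: {power:.2f}")
```

Rerunning with different level-1 and level-2 sizes shows how power trades against the sample-size rules of thumb the authors examine.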

The second technical article in this issue, “Applying Scientific Test and Analysis Techniques to Defense Business Systems Test and Evaluation” written by William Rowell, Ph.D., states that there is interest in applying Scientific Test and Analysis Techniques (STAT) to test and evaluation of Defense Business Systems (DBS). The author has found that the application of STAT tools for T&E of DBS is an iterative process that, when applied correctly, ensures adequate coverage of inputs and use cases and facilitates optimal use of the test resources.

For our next article, “The Value of Scientific Test and Analysis Techniques in Test and Evaluation” by Terry Murphy and Kyle Kolsti, Ph.D., the authors propose methods to improve T&E outcomes. They describe how Scientific Test and Analysis Techniques (STAT) can be applied to the planning, execution, and assessment phases of a typical test program. They state that use of these methods will improve the analytical skills of the T&E workforce and provide higher quality information for decision makers.

In our fourth technical article, “Lessons Learned from an Incompletely Randomized Test Design,” Michael Harman illustrates how incomplete randomization and a flawed design can significantly impact the analysis and the conclusions derived from the data. Intelligent design selection can account for how a test will actually be executed and improve the likelihood of a complete analysis using quantifiable information; in the example used, it reduced the required sample size by more than 50%.
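The article itself concerns randomization, but the sample-size point can be illustrated with a classical example (a hypothetical sketch, not the article's design): a resolution-V half fraction estimates all main effects and two-factor interactions of a five-factor experiment in half the runs.

```python
# Illustrative only: run-count comparison of a full 2^5 factorial versus
# a 2^(5-1) half fraction built from the defining relation I = ABCDE.
from itertools import product

full = list(product([-1, 1], repeat=5))                     # 32 runs
half = [r for r in full if r[0]*r[1]*r[2]*r[3]*r[4] == 1]   # 16 runs

print(f"Full factorial: {len(full)} runs; half fraction: {len(half)} runs")
```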

In the fifth technical article, “A Hybrid Approach to Big Data on High Performance Computing for Army Test and Evaluation,” Brian Panneton, et al., describe the prototype of a scalable big data stack within a traditional High Performance Computing (HPC) architecture, and they explain the rationale behind this work. The goal of the effort is to show an example of how big data architecture can be used throughout the United States Army, and this work takes that effort one step closer to a real-time analysis capability.

For the sixth and last article in this issue, “Know the SCOR for a Multifactor Strategy of Experimentation, Screening, Characterization, Optimization, and Ruggedness (SCOR) Testing,” Mark Anderson highlights how Design of Experiments can be used in a common-sense way to step from screening through modeling to the design of a robust system. The author illustrates the process step by step and shows how this sequence can yield a system that is both characterized and ruggedized.
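As a hypothetical sketch of the screening step that starts such a sequence (not the author's SCOR workflow; the factors, responses, and effect sizes below are invented), main effects from a two-level design rank the factors worth carrying into characterization and optimization:

```python
# Illustrative screening analysis on a 2^3 full factorial with made-up data.
from itertools import product
import numpy as np

rng = np.random.default_rng(3)
X = np.array(list(product([-1, 1], repeat=3)))   # 8-run design matrix

# Hypothetical response: factor A dominates, B is moderate, C is inert.
y = 5.0 + 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 0.5, len(X))

effects = 2 * X.T @ y / len(X)   # main effect = mean(+1 runs) - mean(-1 runs)
for name, e in zip("ABC", effects):
    print(f"Factor {name}: estimated effect = {e:+.2f}")
# Factors with large |effect| proceed to characterization and optimization;
# near-zero factors can be dropped or fixed at their most economical level.
```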

I hope you enjoy this first issue of 2019 for The ITEA Journal of Test and Evaluation. By the time you receive issue 40-1 in March, the June 2019 issue 40-2 will be in final preparation. The theme for June will be “Accelerating Test and Evaluation with LVC and Agile.” For the next issue (the third issue of 2019), 40-3, the deadline for submissions is just after June 1, 2019, and the theme will be “Aligning Modernization of DoD Test Ranges with National Defense Strategy.” We have posted the themes and descriptions for 2020 and 2021 on the ITEA website. Please provide feedback on the choice of themes, and please write early and often.