2019 ITEA Journal Issue Abstracts

ONLINE Issues (ITEA Members Only)


March 2019 – Statistical Methods in T&E

The ITEA Journal of Test and Evaluation - March 2019, Vol. 40, No. 1

The theme for this issue is “Statistical Methods in T&E.” The issue includes the President’s Corner, a Guest Editorial, a feature discussing the upcoming 2019 ITEA Symposium, a feature in the “Capturing the Knowledge Base of Test and Evaluation” series that will become part of the ITEA Test and Evaluation Handbook, and six technical articles.

The President’s Corner, written by William Keegan, President of the ITEA Board of Directors, covers the state of the association, ITEA accomplishments, and upcoming ITEA events.

Our Guest Editorial, “A Conversation with the DOT&E Chief Scientist,” is an interview with Greg Zacharias, Ph.D., Chief Scientist for the Director of Operational Test and Evaluation (DOT&E), conducted by Laura Freeman, Ph.D. In this article, Dr. Zacharias discusses his Air Force career, his academic background, DOT&E initiatives, accelerating acquisition, the use of modeling and simulation, human-systems integration, and autonomous systems testing.

The next feature is an article planned to be part of the ITEA Test and Evaluation (T&E) Handbook. For this issue, Laura Freeman, Ph.D., et al., in “Designing Experiments for Model Validation – The Foundations for Uncertainty Quantification,” state that the validity of information provided by models depends on the rigor of the verification and validation processes used. The authors propose Design of Experiments and statistical analyses as methods to support statistical comparison of simulation results to live data. (Note: Readers can suggest edits or additions to this article or to the draft outline of the handbook at CTEP-handbook@itea.org. The proposed outline for the handbook appears in “Capturing the Knowledge Base of Test and Evaluation,” pages 215-216 of Issue 39(4), December 2018, of The ITEA Journal of Test and Evaluation. The topic of this second handbook article is listed under the subheading “Data Collection and Analysis” and the topic “Statistics principles and tools.”)
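To make the idea concrete, here is a minimal, purely illustrative sketch (with assumed factors, bias, and noise levels of my own choosing, not the authors’ example): a small factorial design is run in both live and simulated form, and a regression with a live/sim indicator tests whether the simulation disagrees with live results over the designed factor space.

```python
"""Illustrative sketch only: paired live and simulated runs over a small
factorial design, analyzed for a simulation-vs-live difference."""
from itertools import product

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Hypothetical 3x2 factorial design: speed (kts) and altitude (kft), replicated twice.
design = [dict(speed=s, altitude=a) for s, a in product([200, 300, 400], [10, 30])] * 2

rows = []
for point in design:
    truth = 0.01 * point["speed"] + 0.05 * point["altitude"]
    rows.append({**point, "source": "live", "y": truth + rng.normal(0, 0.2)})
    # Assume the simulation carries a small bias that the validation test should detect.
    rows.append({**point, "source": "sim", "y": truth + 0.3 + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# If the 'source' term is significant, live and simulated responses disagree
# beyond noise across the designed factor space.
fit = smf.ols("y ~ speed + altitude + C(source)", data=df).fit()
print(fit.summary().tables[1])
```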

Our next feature describes the venue for the 2019 Symposium, which will be held in the Pacific Command’s warfighter zone. I have always enjoyed ITEA events where DoD acquisition and industry leaders provide recommendations to the T&E community; in this case, it will be the warfighters providing some measure of feedback. The article discusses the venue in terms of military testing and experimentation and the significance of the DoD work being done in the region.

Subsequent articles in June and September will cover the event and the program development, along with any tours we can arrange.

Our first of six technical articles, “The Effect of Extremes in Small Sample Size on Simple Mixed Models: A Comparison of Level-1 and Level-2 Size,” written by Kristina Carter, et al., summarizes a simulation study examining the impact of small sample sizes at both the observation (level-1) and nesting (level-2) levels of the model. The authors found that the usual sample-size rules of thumb can sometimes be relaxed while still attaining sufficient power, depending on the type of factor, the acceptability of a higher risk of a Type I error, or when the minimum effect worth detecting is large.
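As a rough illustration of the kind of study described (a hedged sketch of my own, not the authors’ simulation), the following Monte Carlo estimates the power to detect a fixed effect in a random-intercept mixed model when both the number of groups (level-2) and the observations per group (level-1) are small.

```python
"""Hedged sketch: Monte Carlo power check for a fixed effect in a simple
random-intercept mixed model with small level-1 and level-2 sample sizes."""
import warnings

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

warnings.filterwarnings("ignore")  # small-sample fits often warn about convergence
rng = np.random.default_rng(0)

def simulate_power(n_groups=6, n_per_group=5, effect=1.0, n_reps=200, alpha=0.05):
    hits = 0
    for _ in range(n_reps):
        group = np.repeat(np.arange(n_groups), n_per_group)
        x = rng.choice([-1.0, 1.0], size=group.size)      # a two-level factor
        u = rng.normal(0, 1.0, n_groups)[group]           # group random intercepts
        y = effect * x + u + rng.normal(0, 1.0, group.size)
        df = pd.DataFrame({"y": y, "x": x, "group": group})
        try:
            fit = smf.mixedlm("y ~ x", df, groups=df["group"]).fit()
            hits += fit.pvalues["x"] < alpha
        except Exception:
            pass  # skip replicates that fail to converge
    return hits / n_reps

print("Approximate power:", simulate_power())
```

Varying n_groups and n_per_group in this sketch gives a feel for how quickly power erodes, or holds up, as the level-1 and level-2 sample sizes shrink.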

The second technical article in this issue, “Applying Scientific Test and Analysis Techniques to Defense Business Systems Test and Evaluation,” written by William Rowell, Ph.D., notes the growing interest in applying Scientific Test and Analysis Techniques (STAT) to the test and evaluation of Defense Business Systems (DBS). The author finds that applying STAT tools to T&E of DBS is an iterative process that, when done correctly, ensures adequate coverage of inputs and use cases and makes the best use of test resources.
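One common STAT tool for input coverage is combinatorial (pairwise) testing; the sketch below, with hypothetical business-system factors of my own choosing, greedily builds a small test suite that still covers every pair of factor levels, illustrating the kind of resource savings the author describes.

```python
"""Hypothetical sketch: greedy pairwise (2-way) covering suite for business-system inputs."""
from itertools import combinations, product

# Illustrative input factors only; not drawn from the article.
factors = {
    "browser": ["Chrome", "Edge", "Firefox"],
    "user_role": ["clerk", "approver", "auditor"],
    "record_size": ["small", "large"],
    "interface": ["web", "batch"],
}

names = list(factors)
all_tests = [dict(zip(names, combo)) for combo in product(*factors.values())]

# Every factor-level pair that must appear in at least one test case.
uncovered = set()
for f1, f2 in combinations(names, 2):
    for v1 in factors[f1]:
        for v2 in factors[f2]:
            uncovered.add((f1, v1, f2, v2))

def pairs_in(test):
    return {(f1, test[f1], f2, test[f2]) for f1, f2 in combinations(names, 2)}

suite = []
while uncovered:
    # Greedily pick the candidate test that covers the most uncovered pairs.
    best = max(all_tests, key=lambda t: len(pairs_in(t) & uncovered))
    suite.append(best)
    uncovered -= pairs_in(best)

print(f"Full factorial: {len(all_tests)} runs; pairwise suite: {len(suite)} runs")
for test in suite:
    print(test)
```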

In our next article, “The Value of Scientific Test and Analysis Techniques in Test and Evaluation,” Terry Murphy and Kyle Kolsti, Ph.D., propose methods to improve T&E outcomes. They describe how Scientific Test and Analysis Techniques (STAT) can be applied to the planning, execution, and assessment phases of a typical test program, and they argue that using these methods will improve the analytical skills of the T&E workforce and provide higher quality information for decision makers.

In our fourth technical article, “Lessons Learned from an Incompletely Randomized Test Design,” Michael Harman illustrates how incomplete randomization and a flawed design can significantly affect the analysis and the conclusions drawn from the data. In the example used, intelligent selection of the design reduces the sample size by more than 50%, accounts for the way the test will actually be executed, and improves the likelihood of a complete analysis based on quantifiable information.
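The hazard Harman describes can be shown with a toy simulation (entirely my own construction, not his example): when all runs at one factor level are executed before the other, a nuisance time trend is confounded with the factor effect, while a randomized run order protects the estimate.

```python
"""Illustrative sketch: an unrandomized run order confounds a factor effect
with a nuisance time trend; randomization protects the estimate."""
import numpy as np

rng = np.random.default_rng(1)
n_runs = 16
true_effect = 1.0     # assumed true difference between factor levels
drift_per_run = 0.3   # assumed nuisance drift (e.g., warm-up, instrument wear)

def run_test(order):
    """Simulate responses for a +/-1 factor executed in the given run order."""
    time = np.arange(n_runs)
    y = 0.5 * true_effect * order + drift_per_run * time + rng.normal(0, 0.5, n_runs)
    # Simple effect estimate: mean at the high level minus mean at the low level.
    return y[order == 1].mean() - y[order == -1].mean()

grouped = np.array([-1] * 8 + [1] * 8)    # all low-level runs first, then all high
randomized = rng.permutation(grouped)     # same runs, randomized order

print("Estimated effect, grouped order:   ", round(run_test(grouped), 2))
print("Estimated effect, randomized order:", round(run_test(randomized), 2))
print("True effect:", true_effect)
```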

In the fifth technical article, “A Hybrid Approach to Big Data on High Performance Computing for Army Test and Evaluation,” Brian Panneton, et al., describe the prototype of a scalable big data stack within a traditional High Performance Computing (HPC) architecture, and they explain the rationale behind this work. The goal of the effort is to demonstrate how a big data architecture can be used throughout the United States Army, moving the current work one step closer to a real-time analysis capability.

For the sixth and last article in this issue, “Know the SCOR for a Multifactor Strategy of Experimentation, Screening, Characterization, Optimization, and Ruggedness (SCOR) Testing,” Mark Anderson highlights how Design of Experiments can be applied in a common-sense sequence, stepping from modeling the system through to the design of a robust system. The author walks through the process step by step and shows how this sequence can yield a system that is both characterized and ruggedized.
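As a hedged illustration of the screening step in such a sequential strategy (my own example, not Anderson’s), the sketch below builds a 2^(5-2) fractional factorial, simulates a response driven by only two of the five factors, and ranks the estimated main effects to flag the active factors before moving on to characterization and optimization.

```python
"""Minimal screening sketch: a 2^(5-2) fractional factorial to flag active factors."""
from itertools import product

import numpy as np

# Base 2^3 full factorial in coded units for factors A, B, C.
base = np.array(list(product([-1, 1], repeat=3)), dtype=float)
A, B, C = base[:, 0], base[:, 1], base[:, 2]
D, E = A * B, A * C  # generators: D = AB, E = AC (resolution III aliasing)
X = np.column_stack([A, B, C, D, E])

# Hypothetical true behavior: only A and C drive the response.
rng = np.random.default_rng(7)
y = 3.0 * A - 2.0 * C + rng.normal(0, 0.5, len(A))

# Main-effect estimates from contrasts (high-level mean minus low-level mean).
for name, col in zip("ABCDE", X.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"Effect {name}: {effect:+.2f}")
```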

I hope you enjoy this first issue of 2019 for The ITEA Journal of Test and Evaluation. By the time you receive issue 40-1 in March, the June 2019 issue, 40-2, will already be in final preparation; its theme is “Accelerating Test and Evaluation with LVC and Agile.” For the next issue (the third issue of 2019), 40-3, the deadline for submissions is just after June 1, 2019, and the theme will be “Aligning Modernization of DoD Test Ranges with National Defense Strategy.” We have posted all themes and descriptions for the remainder of 2020-2021 on the ITEA website. Please provide feedback on the choice of themes, and please write early and often.