September 2019 – Aligning Modernization of DoD Test Ranges with National Defense Strategy
The theme for this issue is “Aligning Modernization of DoD Test Ranges with National Defense Strategy,” and the issue includes the President’s Corner, a Guest Editorial, an Inside the Beltway, a Test and Evaluation (T&E) Handbook feature, an interview with the Pacific Missile Range Facility Commander, a feature on Helping Develop the Future Workforce, and four technical articles.
The President’s Corner, written by COL William Keegan, USA (Ret.), ITEA President, covers the state of the association, ITEA accomplishments, and upcoming ITEA events. He mentions that our 2019 Symposium is in Lihue, Kauai, beginning November 17, 2019. The Pacific Missile Range Facility (PMRF), on Kauai, is one of many military facilities in Hawaii, and the PMRF is the largest instrumented multi-environment range in the world. An interview with the PMRF Commander is one of the features in this issue.
Our Guest Editorial is “A Conversation with Leadership on Aligning Test & Evaluation Infrastructure to the National Defense Strategy” conducted by Heather Wojton, PhD, Chad Bieber, PhD, and Ryan Norman. This feature is an extensive conversation with Mr. James Faist, Director of Defense Research and Engineering for Advanced Capabilities, Under Secretary of Defense Research and Engineering, Office of the Secretary of Defense, and with Mr. Dave Duma, Principal Deputy Director, Operational Test and Evaluation. Topics of this conversation include positioning the Department of Defense to achieve the three goals in the National Defense Strategy, key emerging technologies that must be tested, investments to enable T&E to support these technologies, and the infrastructure and processes to improve the efficiency of T&E.
For our Inside the Beltway feature, “NAVAIR’s Capabilities Based Test & Evaluation—Test Like We Fight!,” Ken Senechal explains that for some complex acquisition programs, our defense acquisition system is extremely slow. The author states that we can make our testing more efficient and provide the earliest possible notification of mission deficiencies. He describes several ways that innovations in testing can help provide capability to the warfighter sooner.
The next feature discusses a topic related to the Test and Evaluation Handbook. Mark London, PhD, covers “Sample Sizing for Binary Data Structures.” Mark states that one common challenge confronted by engineers is determining just how much data must be collected to satisfactorily verify a particular system requirement. His paper provides a summary of commonly employed sample size estimation methods for the simplest type of data set, namely binary data.
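As a minimal illustration of the kind of calculation such papers survey (the specific method and numbers below are an assumption for illustration, not taken from the article), the classic zero-failure binomial case gives a closed-form sample size for demonstrating a reliability requirement:

```python
import math

def zero_failure_sample_size(reliability: float, confidence: float) -> int:
    """Smallest number of binary trials n such that n successes and
    zero failures demonstrate `reliability` at `confidence` level
    (exact binomial: reliability**n <= 1 - confidence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Example: demonstrating 90% reliability at 80% confidence requires
# 16 consecutive successful trials.
n = zero_failure_sample_size(0.90, 0.80)
print(n)  # 16
```

Requirements stated at higher reliability or confidence drive the trial count up quickly, which is exactly why sample-size planning matters before test execution.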
Our next feature “A Conversation with the Commanding Officer of the Pacific Missile Range Facility (PMRF)” is an interview, conducted by Doug Messer, ITEA Events Committee Chair, with the Commander of a military installation near the location of the 2019 Symposium on Kauai. PMRF is the world’s largest instrumented multi-environment range, capable of supporting subsurface, surface, air, and space testing and training operations.
Our last feature is the “2019 Update on Simulation: Helping Develop the Future Workforce,” written by Steve Gordon, PhD, as an update on the Modeling and Simulation Professional Certification for high school teachers and students. The number of certified teachers and students in Florida is growing, and many Virginia teachers and students have also been certified. This certification is one way to motivate students in science, engineering, and the electronic arts in order to grow the future T&E workforce.

Our first of four technical articles, “Advanced Systems Test and Evaluation (T&E) on DoD Ranges,” written by Jay Clark, et al., summarizes the developments undertaken to enable testing and training with supersonic missiles and to establish a Hypersonic Exercise Coordination Center to streamline the supersonic/hypersonic planning process. Hypersonic system testing and training will require increased coordination because of the operational range of these systems.
The second technical article in this issue, “A Conceptual Framework for Flight Test Management and Execution Utilizing Agile Development and Project Management Concepts” by Craig Hatcher, discusses how adopting an Agile methodology for project execution and management may provide a more natural way of measuring true project progress. He also points out the hazards of following Agile methods in some test programs. For instance, the author recommends a hybrid approach to Agile in order to better fit flight test programs.
Our third article of this issue is “Model-Based Systems and Test Engineering Using System Mission-Oriented Design-of-Experiment Approach” by Ya Li, PhD, who discusses the mission-oriented Design-of-Experiment (DoE) test approach. The author states that the use of DoE and Response Surface Methodology techniques appears to remedy some of the difficulty of Model-Based Systems Engineering’s identification and assessment of integrated system-level capability. This use of DoE has also provided the ability to objectively determine specific areas, within the system’s operational environment, where potential system-of-interest mission failures or inadequacies exist. DoE may also help identify the culprit element(s) that contribute to the potential failures and inadequacies.
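For readers new to DoE, a toy sketch of a full-factorial design over mission factors conveys the basic idea (the factor names and levels here are hypothetical illustrations, not drawn from the article):

```python
from itertools import product

# Hypothetical mission factors and levels for a system-of-interest;
# a real mission-oriented design would derive these from the
# operational environment.
factors = {
    "altitude": ["low", "high"],
    "sea_state": [1, 3, 5],
    "emcon": ["on", "off"],
}

# Full-factorial design: every combination of factor levels becomes
# one test point covering the design space.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))  # 2 * 3 * 2 = 12 test points
```

Fractional designs and response-surface augmentation then trim or extend this grid so that system-level behavior can be modeled with far fewer runs than exhaustive testing would require.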
Our final technical article is “Using Mixed Reality Technologies for Planning and Visualization of the Test Process,” authored by Simon Su, PhD, et al. The authors present a method to display data in three-dimensional (3D) space. The purpose of this project was to research, experiment with, and implement new approaches for modeling and representing data in a nontraditional way. The authors found that modeling 3D data in a 3D digital space allows manipulating it as if it were in the user’s real-world space, providing versatility in a way not possible with 2D representations. Representing the 3D data in a 3D environment can introduce new derived parameters or statistical measures directly from the visualization, and this can be completed on a variety of display devices. The process reduces the number of steps required to reach a conclusion about the data, derives meaning from the data, and reduces the cognitive workload on the users and analysts.
I hope you enjoy this third issue of 2019 for The ITEA Journal of Test and Evaluation. By the time you receive issue 40-3 in September, the December 2019 issue 40-4 is being finalized. That theme in December will be “Drowning in Data: How to Gain Timely Information and Knowledge from Data.” For the next issue (the first issue of 2020), 41-1, the deadline for submissions is just after December 1, 2019, and the theme, if no changes are made, will be “The Right Mix of T&E Infrastructure.”
We have posted all themes and descriptions for 2020 and 2021 on the ITEA website; however, these may be changed as a new Publications Chair transitions with the December issue. Please provide feedback on the choice of themes, and please write early and often.
I look forward to the December 2019 issue of The ITEA Journal of Test and Evaluation, developed by the ITEA Publications Committee under the leadership of Laura J. Freeman, PhD, The ITEA Journal Editor-in-Chief; Assistant Dean for Research in the National Capital Region, College of Science; Associate Director, Intelligent Systems Lab, Hume Center; and Research Associate Professor, Department of Statistics, Virginia Tech.
My time on the ITEA Publications Committee started under the leadership of J. Michael Barton, PhD; then, when I became the Publications Chair, the first issue I planned and organized was the March 2013 issue. I thank all of you who endured all the issues of The ITEA Journal after that. I’ve had the pleasure of working with the outstanding members of the Publications Committee for these many years; they have helped me develop themes and find articles for each issue. I am honored to have served with these technical experts! Since you are reading this and, I hope, other issues of The ITEA Journal, I also thank you, the readers. Please continue reading, provide feedback, and offer your own articles to The Journal.

There’s also a team, under the guidance of the ITEA Executive Director James Gaidry, that edits, formats, and arranges articles for each issue before it is sent to the printer. That team is now Brand Design, Inc. For authors who have submitted articles in the last few years to The ITEA Journal, you’ve worked with Linda and Janet as they helped you finalize your article for publication.
They put up with my broken deadlines and changing priorities, so I thank them for their help to our authors and me and for making a great journal for our members and other readers.

One final explanation: for my last issue of The ITEA Journal of Test and Evaluation, I wanted a cover graphic that would reflect the theme and, in some way, look back at recent themes to see how they have supported Test and Evaluation (T&E) needs. The theme of this September 2019 issue is “Aligning DoD Test Ranges with National Defense Strategy,” and we wondered how well The Journal has supported the T&E side of the National Defense Strategy. For that analysis, we used the Introduction section of the Director, Operational Test and Evaluation, FY2018 Annual Report (https://www.dote.osd.mil/pub/reports/FY2018/pdf/ther/2018dirintro.pdf) as an outline to help frame our analysis. We then surveyed the last 30+ issues to see if our themes supported T&E and National Defense. In the cover graphic, we have arranged the most recent issues based on key topics in the introduction to the referenced report. The findings from this survey of 30+ issues include that there were few articles dedicated to workforce (recruit, retain, train) issues, few articles covering range capabilities and future needs, and even fewer articles dedicated to international T&E collaboration; we hope some of these deficiencies can be corrected by recruiting articles on these topics in the future.
Steve Gordon, PhD
June 2019 – Accelerating Test and Evaluation with LVC and Agile
The theme for this issue is “Accelerating Test and Evaluation with LVC and Agile,” and the issue includes the President’s Corner, two Guest Editorials, an Inside the Beltway, a feature discussing the upcoming 2019 International Test and Evaluation Association (ITEA) Symposium, and seven technical articles. The term “LVC” has many meanings depending on who uses it, the purpose of the discussion, and who is trying to understand it. For this purpose, we want to promote a discussion of the ways that a mix of Live (humans operating actual systems), Virtual (humans operating in simulators), and Constructive (humans controlling computer simulations) can be used to facilitate system trade-space studies, perform evaluations of proposed systems, or compare current to proposed systems in simulated combat environments, as just some examples.
The President’s Corner, written by William Keegan, ITEA President, covers the state of the association, ITEA accomplishments, and upcoming ITEA events.
Our first Guest Editorial is “Sometimes a Picture Is Worth a Thousand Words” written by Steve Hutchison, Ph.D., as a reminder of the critical importance of test and evaluation (T&E). Thorough testing helps make sure that systems work when operators expect, want, and need them.
Our second Guest Editorial is “Cyber Testing in a Rapid DoD Acquisition Environment” from Kevin McGowan. He states that the desire to use Agile methods and rapid prototyping acquisition strategies presents some challenges to the cyber test community. He describes several ways to fully assess and incorporate cyber survivability, even as acquisition and T&E are accelerated, including continuous two-way communication and cooperation between developers, testers, and customers.
For our Inside the Beltway feature, “Going Faster: Implications for Test and Evaluation,” Eileen Bjorkman, Ph.D., explains that regardless of how we label it, DoD acquisition programs are trending toward more rapid fielding and accepting more risk. She also explains that the systems and techniques used to test new systems must be in place before the new systems can be accelerated through T&E.
Our next feature, “Let’s Meet in Kauai One More Time,” is a description of the plans for the 2019 Symposium. We will be holding this Symposium in Kauai for a third time. With 11 military bases in Hawaii, including the largest instrumented multi-environment training and test range in the world on Kauai, this will be an ideal location to hear feedback from warfighters and presentations by acquisition and T&E professionals. This article describes the Symposium’s Technical Program and outlines the basic sequence of events. More information will be provided on the ITEA website, in regular electronic mail announcements, and in a final article in the September issue.
Our first of seven technical articles, “Lessons from Past Rapid Acquisition Programs,” written by Lindsey Davis, Ph.D., summarizes the lessons learned from previous rapid acquisition programs in order to help establish recommended best practices for future programs. She provides case studies to illustrate some of the lessons learned and some best practices that may improve the chances of successful rapid fielding.
The second technical article in this issue, “Test in the Age of Agile: Rising to the Challenge of Agile Software Development,” written by Colonel Douglas Wickert, Ph.D., states that adapting Agile methods for military systems requires careful consideration and changes to traditional T&E methods. He presents the unique challenges related to testing in an Agile process and continuous development frameworks, and he argues that speed must be vectored in a direction that serves the needs of warfighters.
For our next article, “Self Service Infrastructure Environment for Next Generation High Performance Test and Evaluation (T&E),” by Chuck Reynolds and Steve Campbell, the authors explain a growing trend of high-performance computing centers using self-service portals to schedule, reserve, configure, and deliver solutions. The authors describe their proposed solutions to providing T&E as a self-service high performance computing solution.
In our fourth technical article, “A New Strategy for Funding Test and Evaluation Range Infrastructure,” Jeffrey Riemer, Ph.D., describes an alternative funding strategy that can prevent delays in range availability and reduce the risk of not having the necessary T&E infrastructure when needed. The author’s plan leverages the concept of recoupment, and he explains how this can be a win-win for program offices and the T&E ranges.
In the fifth technical article, “Environmental Challenges and Range Sustainability,” Paul Knight addresses the challenges faced by Department of Defense (DoD) test ranges, and he explains the best practices used by ranges to mitigate the impacts of the challenges. With proactive engagement and long-term commitment to stay ahead of the challenges, ranges can be sustained and made ready for accelerated T&E.
For the sixth article, R. Douglas Flournoy, et al., present “StreamServer for Fast Data Analytics” to highlight a class of efficient processing methods to analyze the contents of high velocity, high volume data streams in real time. In their design, the authors achieved up to 25 times better streaming throughput, and they hinted at new research that may allow another leap in throughput speeds.
For the seventh and last article in this issue, “Can Agile, Systems Engineering, and Independent T&E Coexist and Cooperate?,” Dave Brown, Ph.D., and Dave Bell, Ph.D., state that systems engineering provides the top-level structure and process to integrate the Agile process into large-scale developments. They explain that Agile teams generally use the DevSecOps continuous process, generally with significant automation, for delivery of incrementally improved software. The answer to the question in the title is “yes,” with the conclusion that the combination of Agile, systems engineering, and independent T&E shows great promise to enhance the development of complex systems.
I hope you enjoy this second issue of 2019 for The ITEA Journal of Test and Evaluation. By the time you receive issue 40-2 in June, the September 2019 issue 40-3 is being finalized. That theme in September will be “Aligning Modernization of DoD Test Ranges with National Defense Strategy.” For the next issue (the last issue of 2019), 40-4, the deadline for submissions is just after September 1, 2019, and the theme will be “Drowning in Data: How to Gain Timely Information and Knowledge from Data.” We have posted all themes and descriptions for 2020 and 2021 on the ITEA website. We will post more themes later in 2019. Please provide feedback on the choice of themes, and please write early and often.
March 2019 – Statistical Methods in T&E
The theme for this issue is “Statistical Methods in T&E,” and the issue includes the President’s Corner, a Guest Editorial, a feature discussing the upcoming 2019 ITEA Symposium, a feature for “Capturing the Knowledge Base of Test and Evaluation” that will be part of the ITEA Test and Evaluation Handbook, and six technical articles.
The President’s Corner, written by William Keegan, President of the Board of Directors for ITEA, covers the state of the association, ITEA accomplishments, and upcoming ITEA events.
Our Guest Editorial is an interview of Greg Zacharias, Ph.D., the Chief Scientist for the Director of Operational Test and Evaluation (DOT&E), conducted by Laura Freeman, Ph.D., titled “A Conversation with the DOT&E Chief Scientist.” In this article, Dr. Zacharias discusses his career in the Air Force, academic background, DOT&E initiatives, accelerating acquisition, use of modeling and simulation, human-systems integration, and autonomous systems testing.
The next feature is an article planned to be part of the ITEA Test and Evaluation (T&E) Handbook. For this issue, Laura Freeman, Ph.D., et al., in “Designing Experiments for Model Validation – The Foundations for Uncertainty Quantification,” state that the validity of information provided by models is dependent on the rigorous verification and validation processes used. The authors propose Design of Experiments and statistical analyses as methods to support statistical comparison of simulation to live data. (Note: Readers can suggest edits/additions to this article or to the draft outline of the handbook at CTEPfirstname.lastname@example.org. Readers can find the proposed outline for the handbook in Issue 39(4), December 2018, of The ITEA Journal of Test and Evaluation, titled “Capturing the Knowledge Base of Test and Evaluation,” on pages 215-216. The topic of this second article for The Handbook is listed under the subheading “Data Collection and Analysis” and the topic “Statistics principles and tools.”)
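A generic, simplified example of statistically comparing simulated output to live data is a permutation test on the difference of means; this is an illustrative stand-in, not the authors' specific validation procedure, and the data values are invented:

```python
import random
import statistics

def permutation_pvalue(live, sim, n_perm=5000, seed=0):
    """Two-sided permutation test on the difference of means between
    live and simulated observations. Under the null hypothesis that
    the simulation matches reality, group labels are exchangeable."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(live) - statistics.mean(sim))
    pooled = list(live) + list(sim)
    n_live = len(live)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign the live/sim labels
        diff = abs(statistics.mean(pooled[:n_live]) -
                   statistics.mean(pooled[n_live:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Invented miss-distance measurements (same units for both sources).
live = [10.1, 9.8, 10.3, 10.0, 9.9]
sim = [10.0, 10.2, 9.9, 10.1, 10.0]
p = permutation_pvalue(live, sim)
```

A large p-value fails to detect a live-versus-simulation discrepancy at the tested condition; a designed experiment extends this comparison systematically across the factor space rather than at a single point.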
Our next feature is a description of the venue for the 2019 Symposium. We will be holding this Symposium in the Pacific Command’s warfighter zone. I’ve always enjoyed ITEA events where DoD acquisition and industry leaders provide recommendations to T&E. In this case, it will be the warfighters providing some measure of feedback. The discussion in this article is about the venue in terms of military testing and experimentation and the significance of the DoD work being done in the region.
Subsequent articles in June and September will cover the event and the program development. Any tours we can arrange will also be covered in those issues.

Our first of six technical articles, “The Effect of Extremes in Small Sample Size on Simple Mixed Models: A Comparison of Level-1 and Level-2 Size,” written by Kristina Carter, et al., summarizes a simulation study examining the impact of small samples at the observation and nesting levels of the model. The authors found that, in some cases, the rule-of-thumb sample sizes could be reduced while still attaining sufficient power, depending on the type of factor, the acceptability of a higher risk of a Type I error, or whether the minimum effect worthy of detection is large.
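The general idea of such a simulation study can be sketched with a much simpler Monte Carlo power estimate; the two-sample z-style comparison below is an illustrative assumption, not the authors' mixed-model setup:

```python
import random
import statistics

def simulated_power(n, effect, sigma=1.0, crit_z=1.96, trials=2000, seed=1):
    """Monte Carlo estimate of power: the fraction of simulated
    experiments (n observations per group, true mean shift `effect`)
    in which a z-style two-sample comparison rejects the null."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        a = [rng.gauss(0.0, sigma) for _ in range(n)]
        b = [rng.gauss(effect, sigma) for _ in range(n)]
        se = (statistics.pvariance(a) / n + statistics.pvariance(b) / n) ** 0.5
        if abs(statistics.mean(b) - statistics.mean(a)) > crit_z * se:
            rejections += 1
    return rejections / trials

# A large effect keeps power adequate even at a small sample size,
# echoing the paper's finding that rules of thumb can sometimes relax.
power_small_n = simulated_power(n=8, effect=1.5)
```

Repeating such simulations across sample sizes, effect sizes, and error-rate tolerances is how rule-of-thumb guidance gets stress-tested.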
The second technical article in this issue, “Applying Scientific Test and Analysis Techniques to Defense Business Systems Test and Evaluation” written by William Rowell, Ph.D., states that there is interest in applying Scientific Test and Analysis Techniques (STAT) to test and evaluation of Defense Business Systems (DBS). The author has found that the application of STAT tools for T&E of DBS is an iterative process that, when applied correctly, ensures adequate coverage of inputs and use cases and facilitates optimal use of the test resources.
For our next article, “The Value of Scientific Test and Analysis Techniques in Test and Evaluation” by Terry Murphy and Kyle Kolsti, Ph.D., the authors propose methods to improve T&E outcomes. They describe how Scientific Test and Analysis Techniques (STAT) can be applied to the planning, execution, and assessment phases of a typical test program. They state that use of these methods will improve the analytical skills of the T&E workforce and provide higher quality information for decision makers.
In our fourth technical article, “Lessons Learned from an Incompletely Randomized Test Design,” Michael Harman illustrates how incomplete randomization and a flawed design can significantly impact the analysis and derived conclusions from the data. Intelligent selection of designs can reduce the sample sizes by more than 50% in the example used, account for the way the test will be executed, and improve the likelihood of a complete analysis using quantifiable information.
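As a reminder of what complete randomization looks like in practice, here is a minimal sketch of randomizing the run order of a small test matrix; the factors are hypothetical and not from the article:

```python
import random

# Hypothetical test matrix. Run order is randomized so that nuisance
# trends (warm-up effects, weather drift, operator learning) do not
# confound the factors under study.
runs = [(speed, altitude)
        for speed in ("slow", "fast")
        for altitude in ("low", "med", "high")]

rng = random.Random(42)   # fixed seed for a reproducible schedule
order = runs[:]           # keep the design matrix itself intact
rng.shuffle(order)        # execute the runs in this randomized sequence
```

Skipping or only partially applying this step, as the article's example shows, can force restrictive analysis models and weaken the conclusions the data can support.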
In the fifth technical article, “A Hybrid Approach to Big Data on High Performance Computing for Army Test and Evaluation,” Brian Panneton, et al., describe the prototype of a scalable big data stack within a traditional High Performance Computing (HPC) architecture, and they explain the rationale behind this work. The goal of the effort is to show an example of how big data architecture can be used throughout the United States Army. This work will take the current progress one step closer to having real-time analysis capability.
For the sixth and last article in this issue, “Know the SCOR for a Multifactor Strategy of Experimentation, Screening, Characterization, Optimization, and Ruggedness (SCOR) Testing,” Mark Anderson highlights how Design of Experiments can be used in a common-sense way to step through modeling of the system and eventually identify the design of a robust system. The author illustrates the process in steps and shows how this sequence can result in a system that is characterized and ruggedized.
I hope you enjoy this first issue of 2019 for The ITEA Journal of Test and Evaluation. By the time you receive issue 40-1 in March, the June 2019 issue 40-2 is being finalized. That theme in June will be “Accelerating Test and Evaluation with LVC and Agile.” For the next issue (the third issue of 2019), 40-3, the deadline for submissions is just after June 1, 2019, and the theme will be “Aligning Modernization of DoD Test Ranges with National Defense Strategy.” We have posted all themes and descriptions for 2020 and 2021 on the ITEA website. Please provide feedback on the choice of themes, and please write early and often.