2018 Pre-Symposium Tutorials

One (1) 4-hour Tutorial – $205, and Two (2) 4-hour Tutorials – $385

Register Online

Download Registration Form



MONDAY, DEC 10 – 1 to 5 PM

 

Building Better Models Using Robust Data Mining Methods – Tom Donnelly, PhD, CAP – SAS Institute Inc.

Through case studies, you’ll learn to build better and more robust models with advanced predictive modeling techniques. Featured methods include many types of regression (linear, logistic, penalized), neural networks (single-layer, dual-layer, boosted), and decision trees (simple, bagged, boosted). To make these methods robust, you’ll learn to split your data into training, validation (tuning), and test subsets to prevent overfitting. You will also see how to use graphical and statistical comparison techniques to help choose the best predictive model.
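To make the train/validation/test idea concrete, here is a minimal sketch in Python using scikit-learn; it is an illustration only, with synthetic data, and is not necessarily the software or workflow demonstrated in the tutorial.

```python
# Minimal sketch: train/validation/test split and model comparison.
# Illustrative only; synthetic data, scikit-learn rather than the tutorial's tools.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Hold out a test set, then carve a validation (tuning) set from the remainder.
X_trainval, X_test, y_trainval, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_trainval, y_trainval, test_size=0.25, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "boosted trees": GradientBoostingClassifier(),
}

# Fit on training data, compare on validation data, report the winner on test data.
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])

best = max(scores, key=scores.get)
print("validation AUC:", scores)
print("test AUC of best model (%s):" % best,
      roc_auc_score(y_test, models[best].predict_proba(X_test)[:, 1]))
```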

Featured case studies include building ten surrogate models of a computer simulation of a helicopter flying surveillance and identifying the best-predicting model among the various logistic, decision tree, neural, spline, and regression models, as well as analyzing the 1998 KDD Cup cyber attack data set, with more than 40 possible causes of 20 types of attack, and building a robust ensemble predictor model. This tutorial is for analysts, scientists, engineers, and researchers interested in learning how predictive modeling can help them use the data they have today to better predict tomorrow.

 

Improvements in Distributed T&E Using TENA and JMETC – Gene Hudgins, KBR

Together, TENA and JMETC enable interoperability among ranges, facilities, and simulations in a timely and cost-efficient manner. TENA provides real-time system interoperability and interfaces to existing range assets, C4ISR systems, and simulations, fostering reuse of range assets and future software systems. JMETC is a distributed live, virtual, and constructive (LVC) capability that uses a hybrid network architecture: the JMETC Secret Network (JSN), based on the SDREN, is used for Secret testing, while the JMETC Multiple Independent Levels of Security (MILS) Network (JMN) is the T&E enterprise network solution for all classifications and cyber testing. JMETC provides readily available connectivity to the Services’ distributed test and training capabilities and simulations, as well as industry resources.

This tutorial addresses using the well-established TENA and JMETC tools and capabilities to reduce risk in an often-uncertain environment, regularly saving ranges time and money in the process.

 

Introduction to Agile Test and Evaluation – Jennifer Rekas, The MITRE Corporation

Agile software engineering process models, such as Scrum, Kanban, or XP, have been popular for several years. Originally, Agile testing practice focused on individual software projects and how automated testing could be accomplished for small teams. As Agile has become a more accepted process model, organizations look to scale it for larger, more complex systems that are not all software-based, and to identify how to perform test and evaluation in an Agile context using DevOps technologies. This tutorial introduces several Agile and DevOps process concepts, with a focus on Test and Evaluation.

Topics for this lecture-based tutorial include:

– Review of the Agile process at the individual project level and scaled process models for larger systems

– Examples of agile testing practices

– Introduction to DevOps, particularly how test and evaluation fits into that paradigm

– A case study of how agile test and evaluation was implemented on a large system-of-systems effort

 

Software Reliability Engineering in Agile Development – Robert Binder

Planning and performing testing to support software-related Reliability, Availability, and Maintainability (RAM) while following Agile development practices is terra incognita for many programs. This occurs because (1) the differences among RAM testing for materials, electronics, and software are not well understood, (2) the recommended practices for Software Reliability Engineering (SRE) defined in the recently revised IEEE Standard 1633 are not broadly disseminated, and (3) generally accepted testing practices of Agile software development or commercial methodologies such as SAFe do not call for the testing necessary to evaluate software reliability.

This tutorial was developed to educate participants about these issues and to present Reliability Driven Development (RDD), a systematic approach for achieving adequate software reliability testing while following Agile practices.

The specific goals of the tutorial are to provide: for Agile practitioners, a practical summary of how to blend SRE practices with your Agile development approach; for reliability engineers, considerations for applying SRE to Agile projects in your organization; for developers and managers, an overview of the intersection of these two areas; and for testers at all stages, a practical and rigorous strategy for testing that will produce credible software reliability estimates.
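As a rough illustration of what a software reliability estimate can look like, the sketch below fits a simple constant-failure-rate model to hypothetical interfailure times; the data and model choice are invented for illustration, and this is not the specific methodology of IEEE Std 1633 or of RDD.

```python
# Toy sketch: estimating software reliability from observed interfailure times.
# Assumes a constant failure rate (exponential model) purely for illustration;
# the interfailure times below are hypothetical.
import math

interfailure_hours = [12.0, 30.0, 45.0, 80.0, 150.0]  # hypothetical test data

total_time = sum(interfailure_hours)
n_failures = len(interfailure_hours)

mtbf = total_time / n_failures          # maximum-likelihood MTBF estimate
failure_rate = 1.0 / mtbf

def reliability(mission_hours):
    """Probability of completing a mission of the given length, under this model."""
    return math.exp(-failure_rate * mission_hours)

print(f"Estimated MTBF: {mtbf:.1f} hours")
print(f"Reliability over a 24-hour mission: {reliability(24):.3f}")
```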

 

Telemetry Over IP – Gary Thom – Delta Information Systems, Inc.

As telemetry ranges make the move to network-centric architectures, it is worth considering the lessons learned over the past 10 years of designing, installing, troubleshooting, and optimizing telemetry data distribution over IP networks. This tutorial will begin with the motivation for moving to Telemetry over IP (TMoIP). It will then provide a basic networking foundation for understanding TMoIP and TMoIP formats. With this basis, we will be able to discuss network design considerations and tradeoffs for a successful TMoIP deployment. Finally, we will present some of the real-world problems and issues that may arise in a TMoIP system and the troubleshooting techniques that can be used to resolve them.
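As a toy illustration of the basic idea of carrying telemetry frames over an IP network, the sketch below pushes fixed-size frames over UDP with a simple sequence-number and timestamp header; the header layout, frame size, and port number are invented for illustration and are not the standardized TMoIP packet format.

```python
# Toy illustration: sending fixed-size telemetry frames over UDP.
# NOT the TMoIP (IRIG 218) format; header fields, frame size, and port are invented.
import socket
import struct
import time

DEST = ("127.0.0.1", 50000)   # hypothetical receiver address
FRAME_BYTES = 256             # hypothetical PCM frame size

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

sequence = 0
for _ in range(10):
    frame = bytes(FRAME_BYTES)                      # placeholder telemetry payload
    timestamp_us = int(time.time() * 1_000_000)
    # Simple header: 32-bit sequence number + 64-bit timestamp, network byte order.
    header = struct.pack("!IQ", sequence & 0xFFFFFFFF, timestamp_us)
    sock.sendto(header + frame, DEST)
    sequence += 1

sock.close()
```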

 

Test and Evaluation: The Timeless Principles – Matthew T. Reynolds

This tutorial describes the policies and practices of T&E, particularly for novices and for those wanting a refresher. Given that the timeframe in focus for this symposium is 2025 and beyond, the broad landscape of T&E will be presented. Included will be discussions of how today’s T&E environment has evolved and how that evolution will need to continue. As the proverb says, “You won’t know where you’re going until you understand where you’ve been.”

Over the last half century, digital technology has dramatically changed nearly every facet of our lives. This is especially true of our national defense, where threats have been expanding rapidly and the battlefield is becoming global. T&E associated with the development and frequent modernization of today’s warfighting systems has had to overcome major challenges, just to keep pace with new threats, system designs and tactics. T&E programs have also had to anticipate future advances, and even to get ahead of them. Yet the fundamental principles of T&E have remained the same. This tutorial will describe them, and will identify corollaries in domains other than defense, such as communications, transportation, energy, and even consumer products. Key lessons learned and best practices will be described. Topics such as statistics-based test design, modeling and simulation, reliability test engineering, enterprise level approaches and cybersecurity verification will be discussed.



 

TUESDAY, DEC 11 – 8 AM to NOON

 

Process and Statistical Methods for M&S Validation – Laura Freeman, PhD, and Kelly Avery, PhD – Operational Evaluation Division, Institute for Defense Analyses

When computer models and/or simulations are used to support evaluations of operational effectiveness, suitability, survivability, or lethality, they must be rigorously validated to ensure they are satisfactorily representing the real world in the context of the intended use. Data-based comparisons between the model and live testing are an essential component of a defensible validation strategy. Specifically, we need to understand and characterize the usefulness and limitations of the models, including quantifying uncertainty in the model based on comparisons to live testing. This tutorial motivates the importance of models and simulations in operational evaluations. It summarizes the key steps and processes of a defensible validation. We review multiple statistical design and analysis techniques that support rigorous validation and uncertainty quantification. A case study walks through potential design and analysis methods, highlighting the strengths and weaknesses of different techniques. Students will gain an appreciation for the different statistical methods available for validation and establish a framework for selecting the right methods based on the type of model and/or simulation that needs to be validated.
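As one concrete example of a data-based model-versus-live comparison with uncertainty quantification, the sketch below computes a Welch confidence interval on the difference in means between hypothetical simulation output and live-test results; the data are invented, and the tutorial covers a much broader set of design and analysis methods than this single technique.

```python
# Minimal sketch: compare simulation output against live-test data with
# a confidence interval on the difference in means (Welch's method).
# Illustrative only; the data below are invented.
import numpy as np
from scipy import stats

live = np.array([10.2, 12.5, 9.8, 11.1, 13.0, 10.7])              # hypothetical live-test results
sim = np.array([11.0, 11.8, 12.4, 10.9, 12.1, 11.5, 12.0, 11.3])  # hypothetical simulation runs

diff = sim.mean() - live.mean()

# Welch's standard error and Welch-Satterthwaite degrees of freedom.
n1, n2 = len(sim), len(live)
v1, v2 = sim.var(ddof=1), live.var(ddof=1)
se = np.sqrt(v1 / n1 + v2 / n2)
df = (v1 / n1 + v2 / n2) ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))

t_crit = stats.t.ppf(0.975, df)
ci_low, ci_high = diff - t_crit * se, diff + t_crit * se
t_stat, p_value = stats.ttest_ind(sim, live, equal_var=False)

print(f"Mean difference (sim - live): {diff:.2f}")
print(f"95% CI on the difference: ({ci_low:.2f}, {ci_high:.2f}), p = {p_value:.3f}")
```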

 

Processes for Testing with International Partners: Part III – Robert Butterworth & Gloria Deane (DOT&E), Wright Yarborough, PhD (A&S), Tom Bogar (OSD GC)

Testing with our allies to assure interoperability is becoming the norm by necessity. Duplicative testing is inefficient for all nations, so sharing of “test resources” is highly desirable. “Test resources” include test facilities, open-air ranges and operating areas, laboratories, equipment, expertise, methods, data, and funds. Upon making the decision to test, participants must complete certain administrative actions to implement a test program. To test with an international partner, an international agreement must be in force and the partnering nations must negotiate and approve a project arrangement. The laws of sovereign nations govern such activity, and DOD has developed administrative processes to ensure statutory compliance. The Office of the Director, Operational Test and Evaluation (DOT&E) will offer a tutorial to inform members of the test community of the capabilities and limitations of the International Test and Evaluation Program and how to develop project arrangements bilaterally and with multiple partnering nations. Speakers will be representatives from the Office of the Director, International Cooperation in the Office of the Under Secretary of Defense for Acquisition and Sustainment; the International Test and Evaluation team within DOT&E; and international partners with whom the DOD test community has worked for many years.

 

Statistics Every T&E’r Needs for Critical Thinking – Mark Kiemele, PhD – Air Academy Associates

This tutorial will cover the need for critical thinking as well as a high-level view of a variety of data analytic tools that can be used to enhance critical thinking. In a data-driven economy, industry and government leaders rely increasingly on skilled professionals who can see the significance in data and use data analytic techniques to properly collect data, solve problems, create new opportunities, make better decisions and shape change. This tutorial is intended for executives, leaders, managers, and practitioners who need to know how their critical thinking can be enhanced by using some simple statistical concepts. The key takeaway that will be demonstrated is that statistical thinking is a necessary ingredient for effective critical thinking.

 

T&E as a part of Agile Development – Robin Poston, PhD, System Testing Excellence Program

To discuss T&E in support of agile development, we need to explore the evolution of agile methods and the rationale for applying different methods, compare traditional and agile software development approaches, discuss research conclusions regarding agile methods’ impact on software performance, review the benefits and challenges of agile, and appreciate how agile methods fit within the Software Development Life Cycle (SDLC) stages. Furthermore, in this tutorial we will also discuss when to use agile, the role of the tester on agile projects, and the various kinds of testing applicable to agile software developments. The goal is for attendees to be able to evaluate whether requirements testing is being properly integrated into the agile software development process, coordinate development of the operational test strategy in the agile software development environment, coordinate and oversee testing in the agile development environment with government and contractor personnel, and specify testing requirements in the Request for Proposal (RFP) for a software development project in which the agile development process is to be used.

 

The Shallow End of Deep Learning: T&E for Artificial Intelligence – Chris Milroy, Turin Pollard, and Evelyn Rockwell – Alion Science & Technology Corporation (ALION)

The 2018 National Defense Strategy highlights artificial intelligence (AI) as one of the core technologies driving national security competition and as a modernization investment area. However, testing and evaluation techniques for defense applications of modern AI—particularly deep learning systems—have yet to evolve to meet the unique challenges and opportunities posed by the field. This tutorial assumes no prior background in the field and will give interested professionals the concepts, vocabulary, intuitions, and scientific foundations necessary to understand and apply the features of modern AI, including deep learning, to the T&E field.

The power of deep learning is derived from a system’s ability to discover and use patterns too complex to be captured in a compact description—the equations often encompass millions or billions of parameters, each of which can be updated with every training datapoint. This is AI’s central challenge to the T&E community: experimental methods and performance measures designed for traditional systems with less complex equations or rules, like physical systems, cannot be used to comprehensively test or evaluate deep learning systems. AI offers a significant opportunity for T&E, as well: harnessing deep learning systems to supplement tailored tests. For instance, generative models can synthesize test cases that are especially hard for the system under test, even though humans cannot easily describe what makes the cases hard, thereby elevating fuzzing and boundary scanning to a higher level. More broadly, AI offers a fundamentally new approach to testing and evaluation, and this tutorial will help bring participants into the conversation on developing that approach.
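As a much simpler stand-in for the generative approach described above, the sketch below uses a black-box search to push a nominal input toward the decision boundary of a toy model under test, where cases are hardest; the model, data, and parameters are all invented for illustration.

```python
# Minimal sketch: search for inputs that are "hard" for a model under test,
# i.e., inputs near its decision boundary. Everything here is invented for
# illustration; real generative approaches are far more elaborate.
import numpy as np

rng = np.random.default_rng(0)

def model_confidence(x):
    """Stand-in for the system under test: returns a confidence score in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-(x @ np.array([0.8, -0.5, 0.3]))))

def find_hard_case(seed_input, steps=200, step_size=0.05):
    """Perturb a nominal input to drive the model toward maximum uncertainty (confidence near 0.5)."""
    x = seed_input.copy()
    best_gap = abs(model_confidence(x) - 0.5)
    for _ in range(steps):
        candidate = x + rng.normal(scale=step_size, size=x.shape)
        gap = abs(model_confidence(candidate) - 0.5)
        if gap < best_gap:          # closer to the decision boundary = harder case
            x, best_gap = candidate, gap
    return x, best_gap

nominal = np.array([2.0, -1.0, 0.5])
hard_input, gap = find_hard_case(nominal)
print("hard test case:", np.round(hard_input, 3), "boundary gap:", round(gap, 4))
```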
