1999 Computer Game Developer's Conference
17 March 1999, San Jose, California


Fundamental Principles of Modeling and Simulation

Roger Smith

3481 Woodley Park Place

Oviedo, Florida 32765

(407) 977-3310

smithr@modelbenders.com

http://www.modelbenders.com/


Abstract

Creating interactive digital worlds in games, web spaces, and military simulations has always been a highly customized process founded upon the experience of the individual designers and programmers involved. However, as more and more digital worlds are constructed, some basic principles are emerging that are common to all projects. This paper describes the fundamental principles being distilled from the development of products for gaming and military simulation: The Golden Rule of Modeling, Axioms to the Golden Rule, The 10 Commandments of Modeling, and the Laws of Data.

Introduction

The lessons presented in this paper were learned from many years of experience building simulations for the Department of Defense. Military simulation products are very much like computer games, and in some cases are the computer games of the future. Therefore, these lessons may have current and future applicability to game developers, just as they have guided military modelers in the past.

The military defines a model as "an abstraction that represents the state or behavior of a system to some degree". A simulation is "a complete system that exercises models for the purpose of training, analysis, or prediction". In this context it is important to point out that both models and simulations are expected to represent the objects and events of a virtual world accurately, though the level of detail may vary greatly. Simulations can be found that represent individual vehicles, their articulated parts, the physics of movement, and ballistic fly-outs of individual munitions. Other, equally accurate and useful, simulations represent hundreds of vehicles as a single icon, their combined combat capability as one variable, terrain as an enumeration covering multiple kilometers, and engagement as a force exchange ratio. In both cases, the fidelity of the system is less important than its consistency in capturing interactions between objects. Consistency of interaction allows users to adjust their view of the world and enter it at the appropriate level.
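To make the idea of representation levels concrete, the following sketch (written in Python purely for illustration; every name and number in it is hypothetical) describes the same force at two levels of abstraction: one record per vehicle, and a single aggregate record whose entire combat capability is one variable. Neither level is inherently more correct; each is adequate so long as its interactions remain consistent.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Vehicle:
    """Entity-level representation: every vehicle carries its own state."""
    vehicle_id: int
    position: Tuple[float, float]   # (x, y) location in meters
    ammunition: int


@dataclass
class AggregateUnit:
    """Aggregate representation: hundreds of vehicles become one icon."""
    unit_id: str
    position: Tuple[float, float]   # center of mass of the unit
    combat_power: float             # entire combat capability as one number


def aggregate(vehicles: List[Vehicle], unit_id: str) -> AggregateUnit:
    """Roll an entity-level force up into a single aggregate record."""
    n = len(vehicles)
    cx = sum(v.position[0] for v in vehicles) / n
    cy = sum(v.position[1] for v in vehicles) / n
    # Hypothetical scoring rule: one point per vehicle plus a small ammunition bonus.
    power = sum(1.0 + 0.01 * v.ammunition for v in vehicles)
    return AggregateUnit(unit_id, (cx, cy), power)


if __name__ == "__main__":
    platoon = [Vehicle(i, (50.0 * i, 0.0), 40) for i in range(4)]
    print(aggregate(platoon, "1st Platoon"))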

Simulations vary widely, but beneath the surface they all begin as an exercise in capturing salient features of the real world and translating those into a virtual world. In all cases the process of doing this successfully follows some fundamental principles which are described in this paper. Of course, we do not pretend that every principle is presented here, only that these are a useful set that can serve as invaluable guidelines in developing simulations, models, games, virtual worlds, and digital playgrounds.

The Golden Rule of Modeling™

There is one rule that far overshadows all others. This has been elevated to the status of The Golden Rule. It is a guiding light that is obvious once you hear it, unforgettable once you know it, easily propagated if you believe it, but quickly forgotten when you are absorbed in the process of creating a model, simulation, game, or virtual world.

The Golden Rule of Modeling

"A model has no inherent value of its own. The value of a model is based entirely upon the degree to which it solves someone's real-world problem."

"Obvious!" you say. "Who could forget that?" you ask. "We teach it to all our people," you claim. But how many products reflect it? How many programmers adhere to it? What is done to instill it? Like all golden rules, it is easy to accept, but hard to follow.

In the military realm, the Golden Rule directs us to consider the purpose for which the simulation is being acquired, such as training, analysis, or prediction. The Golden Rule dictates the level of fidelity necessary to solve the problem, the extra features that can be added to each model, the amount of data presented on a user interface, and hundreds of other characteristics. The Golden Rule drives us to build a simulation focused on the current and future problems that our customers will face. In computer games the effects are the same, though they may be expressed in different words because the customer is an entire marketplace and the predominant mission is to entertain that customer.

There are two major offenders against the Golden Rule. The first is the manager or marketer who advertises features that do not exist, that are totally superfluous, or that disturb the interactive balance needed to ensure a good model. Marketing meetings, press interviews, and product briefings have a life of their own and result in the creation of features that were never intended for the product. Unfortunately, once uttered, these descriptions must be made into software facts. This practice has been going on since the first program was written, but has become commonly recognized through its hilarious treatment in the Dilbert™ comic strip.

The second offender is the programmer who pushes interesting ideas, new algorithms, and secret capabilities into the software. Though exciting and challenging to add to the simulation, these features are not free. They add cost in development hours, CPU cycles, code complexity, maintenance, and justification when found out. The marketers have been vilified for their role in violating the Golden Rule of Modeling, but the programmers have remained relatively unscathed, retaining their image of clever nonconformists. Occasionally, violators are vindicated when the customer later demands the capabilities that entered the system this way. However, the added features are usually just burdens that dog the product throughout its life.

Axioms to the Golden Rule

Any universal rule worth its salt will generate axioms that further define the implications of the rule. These axioms describe specific applications of the Golden Rule, or principles that follow if the rule is true.

Axiom #1: Models are not universally useful, but are designed for specific purposes.

Every customer or market segment has a different set of needs. Sometimes these needs are closely aligned with the needs of previous customers. More often, some segment of those needs diverges from those of the original customer. Therefore, a model that was the perfect solution yesterday may be totally inadequate today. There was a time when Lanchester’s differential equations (published in 1916) were the miracle cure for all direct fire combat modeling. But the assumptions behind those equations grow less valid every day, and new methods are now demanded for the same task. There was also a time when we were all mesmerized by the brilliance of text-based adventure games and Pong on our television sets. Divergence is always a function of time, but it is also a function of the domain in which the customer exists. Though Quake™ may be a best-selling shooter for male players, it is probably not the core of a market blockbuster aimed at female customers.
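For readers who have not met them, Lanchester’s square-law equations couple the attrition of two forces so that each side’s loss rate is proportional to the size of the opposing force. The sketch below integrates them with a simple Euler step; the force sizes, effectiveness coefficients, and time step are illustrative assumptions, and a modern direct fire model would rest on validated data and far richer behavior.

def lanchester_square(blue, red, blue_effect, red_effect, dt=0.01, t_end=10.0):
    """Integrate dB/dt = -red_effect * R and dR/dt = -blue_effect * B with Euler steps."""
    t = 0.0
    history = [(t, blue, red)]
    while t < t_end and blue > 0 and red > 0:
        d_blue = -red_effect * red * dt    # Blue losses caused by Red fire
        d_red = -blue_effect * blue * dt   # Red losses caused by Blue fire
        blue = max(0.0, blue + d_blue)
        red = max(0.0, red + d_red)
        t += dt
        history.append((t, blue, red))
    return history


if __name__ == "__main__":
    # 100 Blue versus 120 Red; Blue shooters are slightly more effective per unit.
    for t, b, r in lanchester_square(100.0, 120.0, 0.06, 0.05)[::200]:
        print(f"t={t:5.2f}  blue={b:6.1f}  red={r:6.1f}")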

Axiom #2: A great model of the wrong problem will never be used.

There is a catalog describing nearly all of the simulations owned or operated by the Department of Defense. This catalog lists nearly a thousand systems, many of which were amazing solutions to a specific problem. However, they are usually custom-crafted solutions for that specific problem. Once that problem is solved, the model is no longer of any use. The same happens when the problem transforms itself, as the dissolution of the Soviet Union did to force-on-force combat models. Models that cannot transform themselves along with the problem will fall into the corners of dark closets, never to be fired up again.

Games face this same fate. Thousands of games are available for play, but only a few hit the customer’s needs and wants right on the head. Many models begin with a target in mind, but during the development process they lose sight of that target. These fall into the dark closet because they are pushing the hot buttons of someone other than the customer (probably those of the programmer or the project manager). If completed, these systems are great solutions for a need that no one has.

Axiom #3: Learning to model is better than learning about models.

People who know all about old models, games, or techniques are excellent sources of ideas and lessons learned from the past. However, this knowledge must be combined with an understanding of the fundamental principles for creating a new model. Without this, the model historian will spend his or her life creating combinations of products that already exist. There is certainly a need for this; every model or game can benefit from incorporating good ideas from other games. However, the state of the art moves forward because people understand what is involved in the process of modeling or game design. They know what is essential and what is only a specific implementation. These people can invent the next generation, produce the blockbuster titles, and solve problems that no one else could crack. (Remember the first time you saw Wolfenstein™?)

The 10 Commandments of Modeling™

Through years of experience and interviews with other long-time developers we have arrived at a list of ten principles for building a successful simulation product. These have been organized into the 10 Commandments of Modeling™. Like the original ten commandments, these are touchstones for success that can be kept in the forefront of your mind. But many other rules and guidelines are needed to support, enhance, and clarify these to help you create a great product.

I. Simplify, Simplify

When building a model, game, or simulation a good team can always envision and implement more details than are really necessary to make the product a success. The fertile brains and abundant energy of great people can always imagine and program much more than the customer has asked for, needs, or can appreciate.

The team must be bounded by the needs of the specific product. Additional great ideas should be captured and placed on the storyboard for the next product. If every vision is rendered into software, the final product will be a bloated and confused medley of ideas that are not clearly tied together, or are tied by the thinnest of threads. Great military simulations and computer games focus on a specific mission and do that job very well. Within the military, the Janus and ModSAF simulations have been extremely successful. Janus represents individual or aggregate objects on the battlefield and executes in discrete time-steps. It is not a virtual simulation, uses poor graphics, has an archaic user interface, and requires prodigious amounts of time to build probability-of-kill (Pk) tables for every possible interaction. But it allows training and operational evaluation at a level that is needed by a large military audience. Similarly, ModSAF is a single-CPU simulation that is more advanced than Janus, but it is constantly criticized for what it cannot do. Programmers who extend the AI in the system complain about the limits imposed by the Finite State Machine architecture and the inability to create complex, linked behaviors. However, the system is used on hundreds of projects and continues to be the most widely proliferated simulation within the Department of Defense. ModSAF meets a specific need in a convenient, usable, and modifiable package.
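To give a flavor of what a Pk table implies, the sketch below shows one hypothetical table and one discrete time step of engagements resolved against it. The entries, type names, and range bands are invented for illustration; a real Janus-style database is indexed on many more dimensions (munition, posture, exposure, and so on) and is filled from test and engineering data.

import random

# Hypothetical probability-of-kill table indexed by (shooter, target, range band).
PK_TABLE = {
    ("tank", "tank", "short"):  0.70,
    ("tank", "tank", "long"):   0.35,
    ("tank", "truck", "short"): 0.90,
    ("tank", "truck", "long"):  0.55,
}


def range_band(distance_m):
    """Collapse a continuous range into the coarse bands the table uses."""
    return "short" if distance_m < 1500.0 else "long"


def resolve_shot(shooter_type, target_type, distance_m, rng=random):
    """One table lookup and one random draw decide the outcome of a shot."""
    pk = PK_TABLE.get((shooter_type, target_type, range_band(distance_m)), 0.0)
    return rng.random() < pk


def advance_time_step(engagements):
    """Resolve every (shooter, target, range) pairing scheduled for this time step."""
    return [resolve_shot(*e) for e in engagements]


if __name__ == "__main__":
    random.seed(1)
    print(advance_time_step([("tank", "tank", 900.0),
                             ("tank", "truck", 2200.0)]))   # e.g. [True, False]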

Around 1320 William of Occam summed up the need for simplicity in what has become known as Occam’s Razor: "entia non sunt multiplicanda praeter necessitatem", or in English, "entities are not to be multiplied beyond necessity". Dr. Robert Shannon, a pillar of the discrete event simulation community, has stated that "The tendency is nearly always to simulate too much detail rather than too little. Thus, one should always design the model around questions to be answered rather than imitate the real system exactly." This wise advice is echoed by Albert Einstein, who is widely credited with the maxim that "Everything should be made as simple as possible – but no simpler."

II. Learn from the Past

Successful systems of the past were built by very intelligent and energetic people working with the best tools available at the time. They arrived at solutions that would fit into the computer available and applied considerable ingenuity and feats of engineering to achieve this. It is easy to look backward and smile at those primitive products. But, within each of them are nuggets of gold that should be mined when creating a new system.

Legacy systems, as they are called in the military, are packed with good ideas that can be reused. Compact solutions to complex problems are embedded in every algorithm. Even military simulations have a "ship date" at which version 1.0 is delivered to the customer, but these systems continue to grow and improve for decades. It is not unusual to find a 1970s-era FORTRAN simulation still running on a DEC VAX. But the software will have been improving and maturing internally for 20 years and is far beyond the capabilities delivered in version 1.0.

Model developers who study these are continually amazed by the complex virtual world that has been squeezed into these old machines. The creativity born of limited resources is capable of achieving what appears impossible to the general observer.

Of course, these old systems provide lessons on what not to do as well. The same traps and snares that snagged your predecessor a decade before are waiting for the new developer. Ask yourself why your predecessor did not use some of the ideas you are considering.

III. Create a Conceptual Model

A team of young, energetic, talented programmers is always eager to start programming immediately. This admirable quality must be harnessed and directed toward the very difficult process of creating a conceptual model that will serve as the blueprint and foundation for the product. This is the part of the design process that attempts to capture the characteristics of the real world that will be represented in the software.

Conceptual modeling consists of selecting the objects, attributes, events, and interactions that will form the product. Without resorting to programming, you want to identify and define a set of these that work together to form a complete, complementary, and efficient product. When creating a virtual world there are an uncountable number of combinations of characteristics and intentions. Some are empowering, some inert, and some fatal. A working conceptual model defines a virtual world that operates efficiently and appears complete and consistent. Designers can experiment with new ideas and trace their impacts on other algorithms within the system. Constant experimentation arrives at the best package that can be found, and does so without the long development time needed to do the same in software.
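One lightweight way to record a conceptual model before any engine code exists is to write the objects, attributes, events, and interactions down as plain data and run simple consistency checks over them. The sketch below shows one possible form; every object, attribute, and event named in it is hypothetical.

CONCEPTUAL_MODEL = {
    "objects": {
        "Tank":    {"attributes": ["position", "fuel", "ammunition", "speed"]},
        "Bridge":  {"attributes": ["position", "capacity", "damage_state"]},
        "Company": {"attributes": ["position", "combat_power", "orders"]},
    },
    "events": ["move", "detect", "fire", "damage", "resupply"],
    "interactions": [
        # (initiator, event, target) triples the virtual world must support
        ("Tank", "move", "Bridge"),
        ("Tank", "fire", "Tank"),
        ("Company", "resupply", "Tank"),
    ],
}


def undefined_participants(model):
    """Cheap consistency check: every interaction must reference defined objects."""
    known = set(model["objects"])
    return [i for i in model["interactions"]
            if i[0] not in known or i[2] not in known]


if __name__ == "__main__":
    assert undefined_participants(CONCEPTUAL_MODEL) == []
    print("Conceptual model is internally consistent.")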

IV. Build a Prototype

One of the reasons teams skip the Conceptual Model phase is the extreme difficulty of mentally envisioning and defining an entire virtual world and the infrastructure that will support it. But even after working through that process there are always questions and assumptions that cannot be evaluated without working software. A prototype should be written to explore these dark corners of the conceptual model. There is no need for a prototype to look like the final product. It must enlighten the programmers who are about to jump into the problem, giving them ideas, options, and tools to find the best solution to the problem.

An engineering prototype has the same objective as a conceptual model – to clarify the structure, algorithms, and capabilities of the final product. As essential as this mission is, it must be bounded in time and money relative to the final schedule and budget. Both steps are essential, but neither produces a final product that can be shipped. These are tools to help create a better product, not substitutes or excuses for avoiding product development. Nor can you expect these steps to iron out all of the problems, questions, and mistakes that will be encountered when programming the simulation – they just help reduce the number and severity of future problems.

Finally, to quote Bill Joy of Sun Microsystems, "Large successful systems come from small successful systems." So where do you think large failures come from?

V. Push the User’s Hot Buttons

The game community appears to be better at this principle than the military simulation community. The desire for a beautiful work of engineering genius that will be admired by your peers sometimes leads to products that are perfect at solving the wrong problem. There are many simulation systems that are never used because they solved a problem that no one has.

The development team must be in touch with the customer and understand what gets them excited. When they use a model today, what really turns them on? What makes their job easier? What makes them recommend the product to others? What infuriates them about current models? What are they trying to do, but are thwarted by the limitations of the model? What is dead wrong, laughable, and embarrassing about their current set of tools?

Your new product must capture the success of the old products, but overcome their limitations. Capturing success does not mean duplicating the product (though a copycat product is sometimes the solution), but requires that you achieve the same level of user excitement.

It is easy to fall into the trap of creating a product that the developer wants rather than what the customer wants. But the market base for that product is extremely small.

VI. Model to Data Available

Military simulations for training, analysis, and prediction must accurately capture the performance and behavior of existing systems. These simulations may be used to ingrain life-and-death behaviors in soldiers, guide multi-million dollar purchasing decisions, or direct the future structure of the US military. If they are not accurate, the results can be catastrophic. Therefore, the models must be based on known characteristics and behaviors of the real systems being replicated. But data on these systems is scarce. During a real war the emphasis on capturing data objectively for future decision-makers is overshadowed by the need to stay alive and accomplish the mission. As a result we have a very limited set of quantifiable information about how combat works and how battles unfold.

Model developers need to be aware of the databases that exist in their areas. They need to understand what data exists and what data is totally unavailable. Every software model or game requires data that does not exist in any official or unofficial form. Every model requires that data be synthesized from what is available and from the subjective experiences of soldiers who have performed the operations. But, an effort needs to be made to provide a foundation for the model based on the scarce data that does exist.

VII. Separate Data from Software

If you read simulation code back through time you will see that we have been learning this lesson for 20 years. In the past, the budget for CPU and memory dictated very terse implementations of models. As a result, these tended to be made up of algorithms that had been tuned to the specific situation in which they would be used. Changing the situation required changing the software. However, thanks to improvements from the hardware industry we can now afford the luxury of moving some of our assumptions and system tuning into data that can be changed by the team or by the customer. This results in a product that is much more flexible and valuable to the user.

Even games now allow the user to create their own scenarios, to add new models, and to modify the visual scenes. This power is one of the user’s hot buttons and can only be pushed when the models are driven by data that is accessible to non-programmers.

To paraphrase Art Linkletter, "Users change the darndest things." As programmers we often underestimate the creativity and cleverness of a dedicated user. Who could have imagined all of the Quake conversions that have emerged? We are just now coming to appreciate this and support it with data-driven models, scripting languages, and tools to safely manipulate this data.
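A minimal sketch of this commandment might look like the following, assuming a hypothetical vehicle_data.json file: the tunable numbers live in a plain data file that a designer or customer can edit, while the software only reads them and falls back to shipped defaults when the file is absent.

import json

# Default values shipped with the software; the data file overrides them.
DEFAULTS = {"max_speed_kph": 60.0, "fuel_capacity_l": 1200.0, "main_gun_pk": 0.6}


def load_vehicle_parameters(path="vehicle_data.json"):
    """Read tuning data from disk, falling back to the defaults if the file is absent."""
    try:
        with open(path) as f:
            data = json.load(f)
    except FileNotFoundError:
        data = {}
    # Unknown keys in the file are ignored; missing keys fall back to the defaults.
    return {key: data.get(key, value) for key, value in DEFAULTS.items()}


if __name__ == "__main__":
    params = load_vehicle_parameters()
    print(params["max_speed_kph"])   # changed by editing the data file, not the code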

VIII. Trust Your Creative Juices

When working with a new team that has not created a simulation before, I notice that they are afraid to move forward without explicit direction and definition about what they should build. They are afraid that they will head off in the wrong direction and create a product that others will criticize. This fear of criticism is more crippling than their aversion to reworking a program that has gone wrong.

Experienced members of the team must demonstrate, instill, and encourage the brave act of trusting your own creative juices. The team leaders must provide the vision for the entire product, but each programmer, designer, and artist must have the freedom and confidence to express their vision in the product.

On military projects the fear of making mistakes results in constant repetition of requirements analysis, organizational restructuring, product research, and unproductive meetings. The team avoids making concrete decisions about the design of the product. They will not allow programmers to finish a conceptual model or build a prototype. Thousands of man-hours can be wasted in this trap. But eventually this cycle will be broken by one of the following events:

- Good leaders will not abide remaining in this trap.
- Experienced programmers cannot stand to vacillate around a problem they know how to solve.
- Self-confident programmers (new and experienced) will march somewhere of their own accord.

If your team does not trust its own creative juices and abilities, you either have a poor leader, an unskilled team, or a stifling organization.

IX. Fit Universal Constraints

Every product is bounded by the universal constraints of quality, time, money, and competence.

When you run out of one of these, the product is finished regardless of any software details. Managers have been taught to fit products into the bounds of the first three, but are largely unaware of the fourth.

The quality, detail, and capabilities of a simulation or game are unlimited in and of themselves. The time to produce the product dictates the level of quality in its many forms. The amount of money available limits the size of the team and is tied directly to the time factor (since we all expect to get paid every month). These three constraints are preached in multiple management courses and textbooks and are applied to every form of product under the sun.

However, there is a fourth constraint – competence. Some projects require skills that are in short supply. Therefore, a generously funded project with a long schedule may still be strangled by the inability to hire people with the skills needed to do the work. Good leaders, programmers, designers, and artists are not available to do all of the work that companies want done. As a result, some projects are understaffed and others are staffed with incompetent people.

A successful project must fit into the boundaries formed by all four of these constraints.

X. Distill Your Own Commandments

We opened this discussion by emphasizing that there are many more than ten principles of modeling. The nine listed so far have been derived from the experiences of very talented people. However, the readers of this paper have a rich pool of their own experiences. That pool contains valuable lessons that fit your current project, profession, or hobby. Each of you should distill your own set of commandments to avoid repeating the mistakes you have made in the past, to gravitate toward what you know can be successful projects, and to create a working environment that is productive, rewarding, and profitable. Place confidence in your own lessons; they will be with you forever, and you cannot count on outsiders to solve your problems for you or to guide your career.

The Laws of Data

When building a virtual version of the real world it is essential that you be able to describe the real world in some numerical or rule-based form that can be coded into software. Without this, the models are always a shot in the dark. Since a model is a dynamic picture of the behavior of a system, it is very difficult to evaluate how accurate it is. The initialization data that starts up the system is one indicator of how accurate it can be. It is certainly not the only, or even the strongest, indicator. But it is a very measurable and tangible indicator. When the model or game is running, events move too quickly to provide a good feel for their validity. This leads to capturing the information in a log file that can be studied closely and slowly.
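Such a log might be as simple as the sketch below, which appends each significant event with its simulation time so that a run can be replayed and inspected slowly after the fact. The file name, columns, and events are illustrative assumptions rather than a prescribed format.

import csv


class EventLog:
    """Append significant simulation events to a CSV file for later study."""

    def __init__(self, path="simulation_events.csv"):
        self._file = open(path, "w", newline="")
        self._writer = csv.writer(self._file)
        self._writer.writerow(["sim_time", "event", "subject", "detail"])

    def record(self, sim_time, event, subject, detail=""):
        self._writer.writerow([f"{sim_time:.3f}", event, subject, detail])

    def close(self):
        self._file.close()


if __name__ == "__main__":
    log = EventLog()
    log.record(0.0, "spawn", "tank_01", "grid 123456")
    log.record(4.2, "fire", "tank_01", "target=truck_07")
    log.close()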

Military simulations may be much more finicky about the accuracy of their data than are computer games. But, as the virtual worlds in games become richer and more interactive, the need for accurate data that interacts in reasonable and realistic patterns will increase. Customers get a feel for the realism of the simulation based on its interactions more than its static appearance (in the form of initialization data or screen shots). Creating believable or realistic models is predicated on capturing the characteristics, behaviors, and interactions of the real world - or creating entirely synthetic laws of physics under which to operate. The latter is a much more difficult task, so developers tend to prefer the former. When collecting accurate data upon which to base a model you will find the following four laws in effect.

First Law. You can never get all of the data you need.

No matter what level of detail you are building, a complete set of data has not been collected, organized, and cataloged to meet your needs. The people who have collected data that is available usually did so for a study, model, or game they were focused on. As a result, their data never covers all of the aspects of your virtual world.

Second Law. You can not use all of the data you can get.

The First Law of Data is not an indication that data on any subject is scarce. In fact, the world is awash in a sea of data. However, much of this information is overlapping and contradictory, and much of it describes aspects of the problem that are of no interest to you. This often makes it impossible to combine the data from a number of sources into a single complete description. Of course, there is always that bucket of data that is of no use to anyone at all.

Third Law. You always have to collect some of your own data.

The first two laws make it very clear that you are going to have to do some data collection yourself. This may involve measuring or observing the process of cutting a path through the forest, assaulting an embassy with a team of soldiers, or walking horses across a muddy bog. Hopefully, once you have the information, you will share it with the world, pushing back the boundaries of the unmeasured and uncataloged.

Fourth Law. You always have to synthesize data to meet the needs of your model.

As willing as you may be to collect data on the behavior of a system, there is some data that is impossible to come by. It is unlikely that you will conduct experiments to discover the thickest wall that can be penetrated by a karate kick, measure the survival time of human flesh in a pool of lava, or find out how far a pig can free fall without becoming bacon.

Every model or game contains data that is synthesized by the model developers. This may be based on principles of physics, extrapolations of experiments, informed speculation, or pure fantasy. No model can get by without the creative bravery of a few people willing to make a guess. In some circles this process is frowned upon and its practice is covered with the most arcane scientific explanations. But, in truth, it is an honest part of the business and should be accepted as such.

Conclusion

This paper has presented lessons learned from years of excitement, productivity, suffering, and stagnation. The author and those consulted have learned about success and failure the hard way – through experience. But we have also learned from the experience of others. It is our hope that you will benefit from these lessons and spend more time on successful projects, abandon failures as soon as possible, refuse to follow incompetent leaders, and create greater products than we have.

Trademarks

The Golden Rule of Modeling and The 10 Commandments of Modeling are trademarks of Roger Smith. Dilbert is a trademark of United Feature Syndicate. Quake and Wolfenstein are trademarks of id Software.

References

Hughes, Wayne P. Editor. 1997. Military Modeling for Decision Making, Third Edition. Military Operations Research Society, Alexandria, Virginia.

Law, Averill and Kelton, W. David. 1991. Simulation Modeling and Analysis. McGraw Hill. New York, NY.

Smith, Roger. 1998. Military Simulation Techniques & Technology. 3-day Course Notebook. http://www.magicnet.net/~smithr/mstt

Speaker: Roger Smith

Roger Smith is the Technical Director for STAC Technologies, an Adjunct Professor at the Florida Institute of Technology, a Consultant for Distributed Simulation Technologies, and a devout student and teacher of simulation technology. He is actively involved in designing, programming, and fielding simulations for the Department of Defense, having contributed to flight simulators, wargames, computer generated forces, and intelligence simulations. He is the author of over 30 papers on simulation-related topics, the editor of multiple conference proceedings, and the creator of three intensive short-courses on modeling and simulation.



© Copyright 1999, Roger D. Smith