Table of Contents

- Manual
- Getting Started
  - Starting the Program
  - Retrieving Data
  - Manipulating Data
- The Variable List
  - The Variable List Menu
    - Filter Observations/Selecting
    - Add New Variables
    - Delete Variables
    - Edit Metadata
    - Set Replicate Weights
    - New Variable Reserve
    - Edit Value Labels
    - Dummy Code Categorical Variable
    - Collapse Categories of Categorical Variable
    - Set Missing Values
  - The Expression Evaluator
- Saving and Re-running Actions
- Sampling
- Procedures
  - Measurement Models
    - MML Models for Test Data
  - Other Available Procedures
- Graphics
- Tools
- Estimation Methods
  - Optimization Techniques
  - Variance Estimation
- Post-hoc Procedures
- More User Input Instructions
  - The User Interface
  - Input Instructions
  - Options
  - Output Precision
- Glossary of Terms and Symbols

Bayesian Estimation

Bayesian estimation provides an alternative to the classical, or sampling-theory, approach. In contrast to classical statistical inference, where population parameters are treated as fixed, Bayesian inference assigns probabilities to the possible values of a population parameter and, through Bayes' theorem, modifies these probabilities based on the evidence contained in the data.

In classical statistical inference, the parameters of the population are considered fixed, and the particular distribution that results from a given sample is treated as variable. Bayesian inference, by contrast, treats the data one has observed as given and treats the parameters as random, beginning with a *prior distribution* that expresses one's beliefs about their values before seeing the data. One then modifies this distribution with the likelihood function drawn from the data to create a new distribution, known as the *posterior distribution*, from which one can draw *posterior probabilities* for the values of the parameters.

In a situation where we want to know the probabilities of each of several *P*'s that are, taken together, exclusive and exhaustive, the probability of *D* can be written as a weighted sum of the conditional probabilities of *D* given each *P _{i}*, where the weights are the prior probabilities of each *P _{i}*. Substituting this sum into the denominator gives

Pr(*P _{i}* | *D*) = Pr(*D* | *P _{i}*) Pr(*P _{i}*) / [Pr(*D* | *P _{1}*) Pr(*P _{1}*) + ... + Pr(*D* | *P _{k}*) Pr(*P _{k}*)],

which is Bayes' theorem for the discrete case with *k* different *P*'s.

In the original applications of Bayes' theorem, the *P*'s represented different populations; however, they can also represent different possible values of a given parameter of a single population. In either case, *D* represents the data being used in the estimation of the model.
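As a hypothetical numerical illustration of the discrete case, the posterior probabilities can be computed directly from Bayes' theorem. The priors and likelihoods below are made-up values chosen for the sketch, not anything produced by the program:

```python
# Hypothetical discrete Bayes' theorem example: three exclusive, exhaustive P's.
priors = [0.5, 0.3, 0.2]          # Pr(P_i): prior probability of each P_i (assumed)
likelihoods = [0.10, 0.40, 0.70]  # Pr(D | P_i): probability of the data given P_i (assumed)

# Pr(D) is the weighted sum of the conditional probabilities of D given each P_i,
# with the prior probabilities as weights.
pr_d = sum(l * p for l, p in zip(likelihoods, priors))

# Posterior: Pr(P_i | D) = Pr(D | P_i) Pr(P_i) / Pr(D)
posteriors = [l * p / pr_d for l, p in zip(likelihoods, priors)]
```

Because the *P*'s are exclusive and exhaustive, the posterior probabilities necessarily sum to one.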

If the possible values of some characteristic of a population are represented by a continuous variable rather than a set of *k* discrete possible values, the probability that the parameter has any particular value becomes zero. Thus, each *P _{i}* becomes a range of values, and Pr(*P _{i}*) becomes the probability that the parameter falls within that range; in the limit, the sums in Bayes' theorem are replaced by integrals over the prior density.
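One way to see the continuous limit is a grid approximation, in which the integral is replaced by a sum over many narrow ranges. The sketch below assumes a binomial likelihood with a flat prior; the data and the setup are illustrative assumptions, not the manual's estimation method:

```python
import math

# Assumed data D: 7 successes in 10 trials; parameter is the success probability.
n, k = 10, 7

# Divide [0, 1] into many narrow ranges (grid points); each plays the role of a P_i.
grid = [i / 200 for i in range(201)]
width = grid[1] - grid[0]

prior = [1.0 for _ in grid]                                     # flat prior density
like = [math.comb(n, k) * t**k * (1 - t)**(n - k) for t in grid]  # Pr(D | t)

# Pr(D): the weighted sum becomes a numerical integral over the prior density.
unnorm = [l * p for l, p in zip(like, prior)]
pr_d = sum(unnorm) * width

# Posterior density over the grid: Pr(t | D) = Pr(D | t) p(t) / Pr(D)
posterior = [u / pr_d for u in unnorm]
```

With a flat prior, the posterior density peaks at the sample proportion, 7/10 = 0.7.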

Bayes, T. (1763). An essay towards solving a problem in the doctrine of chances. *Philosophical Transactions of the Royal Society of London*, 53, 370-418. (Reprinted in *Biometrika*, 1958, 45, 293-315.)

Box, G. E. P., & Tiao, G. C. (1973). *Bayesian inference in statistical analysis*. Reading, MA: Addison-Wesley.

Iversen, G. R. (1984). *Bayesian statistical inference*. Sage University Paper 96: Quantitative Applications in the Social Sciences.

Bayesian inference is used in NAEP to obtain estimates of the parameters of the IRT models.