-1 compared against it, one at a time. The IIA hypothesis is a core hypothesis in rational choice theory; however, numerous studies in psychology show that individuals often violate this assumption when making choices. An example of a problem case arises if choices include a car and a blue bus. Suppose the odds ratio between the two is 1 : 1. Now if the option of a red bus is introduced, a person may be indifferent between a red and a blue bus, and hence may exhibit a car : blue bus : red bus odds ratio of 1 : 0.5 : 0.5, thus maintaining a 1 : 1 ratio of car : any bus while adopting a changed car : blue bus ratio of 1 : 0.5. Here the red bus option was not in fact irrelevant, because a red bus was a perfect substitute for a blue bus.
{\displaystyle {\begin{aligned}{\frac {e^{({\boldsymbol {\beta }}_{c}+C)\cdot \mathbf {X} _{i}}}{\sum _{k=1}^{K}e^{({\boldsymbol {\beta }}_{k}+C)\cdot \mathbf {X} _{i}}}}&={\frac {e^{{\boldsymbol {\beta }}_{c}\cdot \mathbf {X} _{i}}e^{C\cdot \mathbf {X} _{i}}}{\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}e^{C\cdot \mathbf {X} _{i}}}}\\&={\frac {e^{C\cdot \mathbf {X} _{i}}e^{{\boldsymbol {\beta }}_{c}\cdot \mathbf {X} _{i}}}{e^{C\cdot \mathbf {X} _{i}}\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}}\\&={\frac {e^{{\boldsymbol {\beta }}_{c}\cdot \mathbf {X} _{i}}}{\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}}\end{aligned}}}
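As a quick numerical check of this invariance (a sketch; the score values are arbitrary, and class_probabilities is a helper defined here, not a library function):

```python
import numpy as np

# Arbitrary per-class scores beta_k . X_i for K = 3 classes,
# and an arbitrary constant shift playing the role of C . X_i.
scores = np.array([1.2, -0.3, 0.7])
shift = 5.0

def class_probabilities(s):
    """Softmax of the per-class scores."""
    e = np.exp(s - s.max())  # subtracting the max changes nothing, but is numerically stable
    return e / e.sum()

# Shifting every score by the same constant leaves the probabilities unchanged.
print(class_probabilities(scores))
print(class_probabilities(scores + shift))  # identical output
```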
{\displaystyle {\begin{aligned}\Pr(Y_{i}=1)&=\Pr(Y_{i,1}^{\ast }>Y_{i,2}^{\ast }{\text{ and }}Y_{i,1}^{\ast }>Y_{i,3}^{\ast }{\text{ and }}\cdots {\text{ and }}Y_{i,1}^{\ast }>Y_{i,K}^{\ast })\\\Pr(Y_{i}=2)&=\Pr(Y_{i,2}^{\ast }>Y_{i,1}^{\ast }{\text{ and }}Y_{i,2}^{\ast }>Y_{i,3}^{\ast }{\text{ and }}\cdots {\text{ and }}Y_{i,2}^{\ast }>Y_{i,K}^{\ast })\\\cdots &\\\Pr(Y_{i}=K)&=\Pr(Y_{i,K}^{\ast }>Y_{i,1}^{\ast }{\text{ and }}Y_{i,K}^{\ast }>Y_{i,2}^{\ast }{\text{ and }}\cdots {\text{ and }}Y_{i,K}^{\ast }>Y_{i,K-1}^{\ast })\\\end{aligned}}}
(also known as features, explanators, etc.), which are used to predict the dependent variable. Multinomial logistic regression is a particular solution to classification problems that use a linear combination of the observed features and some problem-specific parameters to estimate the probability of each particular value of the dependent variable. The best values of the parameters for a given problem are usually determined from some training data (e.g. some people for whom both the diagnostic test results and blood types are known, or some examples of known words being spoken).
{\displaystyle {\begin{aligned}\Pr(Y_{i}=1)&=\Pr(Y_{i,1}^{\ast }>Y_{i,k}^{\ast }\ \forall \ k=2,\ldots ,K)\\&=\Pr(Y_{i,1}^{\ast }-Y_{i,k}^{\ast }>0\ \forall \ k=2,\ldots ,K)\\&=\Pr({\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}+\varepsilon _{1}-({\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}+\varepsilon _{k})>0\ \forall \ k=2,\ldots ,K)\\&=\Pr(({\boldsymbol {\beta }}_{1}-{\boldsymbol {\beta }}_{k})\cdot \mathbf {X} _{i}>\varepsilon _{k}-\varepsilon _{1}\ \forall \ k=2,\ldots ,K)\end{aligned}}}
"experiments" — although an "experiment" may consist of nothing more than gathering data. The goal of multinomial logistic regression is to construct a model that explains the relationship between the explanatory variables and the outcome, so that the outcome of a new "experiment" can be correctly predicted for a new data point for which the explanatory variables, but not the outcome, are available. In the process, the model attempts to explain the relative effect of differing explanatory variables on the outcome.
) that serve as predictors. However, learning in such a model is slower than for a naive Bayes classifier, and thus may not be appropriate given a very large number of classes to learn. In particular, learning in a naive Bayes classifier is a simple matter of counting up the number of co-occurrences of features and classes, while in a maximum entropy classifier the weights, which are typically maximized using
, i.e. it shifts the mean by a fixed amount, and if two values are both shifted by the same amount, their difference remains the same. This means that all of the relational statements underlying the probability of a given choice involve the logistic distribution, which makes the initial choice of the extreme-value distribution, which seemed rather arbitrary, somewhat more understandable.
(IIA), which is not always desirable. This assumption states that the odds of preferring one class over another do not depend on the presence or absence of other "irrelevant" alternatives. For example, the relative probabilities of taking a car or bus to work do not change if a bicycle is added as an additional possibility. This allows the choice of
that is broken down into a series of submodels where the prediction of a given submodel is used as the input of another submodel, and that prediction is in turn used as the input into a third submodel, etc. If each submodel has 90% accuracy in its predictions, and there are five submodels in series,
This means that the effect of using an error variable with an arbitrary scale parameter in place of scale 1 can be compensated simply by multiplying all regression vectors by the same scale. Together with the previous point, this shows that the use of a standard extreme-value distribution (location
If the multinomial logit is used to model choices, it may in some situations impose too much constraint on the relative preferences between the different alternatives. This point is especially important to take into account if the analysis aims to predict how choices would change if one alternative were to
are determined for all independent variables for each category of the dependent variable with the exception of the reference category, which is omitted from the analysis. The exponential beta coefficient represents the change in the odds of the dependent variable being in a particular category
(or alternatively, one of the other coefficient vectors). Essentially, we set the constant so that one of the vectors becomes 0, and all of the other vectors get transformed into the difference between those vectors and the vector we chose. This is equivalent to "pivoting" around one of the
given the measured characteristics of the observation. This provides a principled way of incorporating the prediction of a particular multinomial logit model into a larger procedure that may involve multiple such predictions, each with a possibility of error. Without such means of combining
{\displaystyle \Pr(Y_{i}=K)\,=\,1-\sum _{j=1}^{K-1}\Pr(Y_{i}=j)\,=\,1-\sum _{j=1}^{K-1}{\Pr(Y_{i}=K)}\;e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}\;\;\Rightarrow \;\;\Pr(Y_{i}=K)\,=\,{\frac {1}{1+\sum _{j=1}^{K-1}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}}}
The reason why we need to add a term to ensure normalization, rather than multiply as is usual, is that we have taken the logarithm of the probabilities. Exponentiating both sides turns the additive term into a multiplicative factor, so that the probability is just the Gibbs measure:
The observed outcomes are the party chosen by a set of people in an election, and the explanatory variables are the demographic characteristics of each person (e.g. sex, race, age, income, etc.). The goal is then to predict the likely vote of a new voter with given characteristics.
and is a serious problem in real-world predictive models, which are usually composed of numerous parts. Predicting probabilities of each possible outcome, rather than simply making a single optimal prediction, is one means of alleviating this issue.
, etc.) is the procedure for determining (training) the optimal weights/coefficients and the way that the score is interpreted. In particular, in the multinomial logit model, the score can directly be converted to a probability value, indicating the probability of observation i choosing outcome k
is significantly less than the maximum of all the values, and will return a value close to 1 when applied to the maximum value, unless it is extremely close to the next-largest value. Thus, the softmax function can be used to construct a weighted average
, which is the variable over which the probability distribution is defined. However, it is definitely not constant with respect to the explanatory variables, or crucially, with respect to the unknown regression coefficients
There are multiple equivalent ways to describe the mathematical model underlying multinomial logistic regression. This can make it difficult to compare different treatments of the subject in different texts. The article on
The multinomial logistic model assumes that data are case-specific; that is, each independent variable has a single value for each case. As with other types of regression, there is no need for the independent variables to be
possible values. These possible values represent logically separate categories (e.g. different political parties, blood types, etc.), and are often described mathematically by arbitrarily assigning each a number from 1 to
{\displaystyle 1=\sum _{k=1}^{K}\Pr(Y_{i}=k)\;=\;\sum _{k=1}^{K}{\frac {1}{Z}}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}\;=\;{\frac {1}{Z}}\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}
Because only differences of vectors of regression coefficients are used, adding an arbitrary constant to all coefficient vectors has no effect on the model. This means that, just as in the log-linear model, only
(possibly including "no disease" and/or other related diseases) in a set of patients, and the explanatory variables might be characteristics of the patients thought to be pertinent (sex, race, age, blood pressure,
{\displaystyle \Pr(Y_{i}=k)={\frac {e^{{\boldsymbol {\beta }}'_{k}\cdot \mathbf {X} _{i}}}{1+\sum _{j=1}^{K-1}e^{{\boldsymbol {\beta }}'_{j}\cdot \mathbf {X} _{i}}}}\;\;\;\;,\;\;k\leq K}
{\displaystyle {\begin{aligned}{\boldsymbol {\beta }}'_{k}&={\boldsymbol {\beta }}_{k}-{\boldsymbol {\beta }}_{K}\;\;\;,\;k<K\\{\boldsymbol {\beta }}'_{K}&=0\end{aligned}}}
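A minimal sketch of this re-parameterization in code (the coefficient values are invented for illustration):

```python
import numpy as np

# K = 3 coefficient vectors over M = 2 features (invented values).
beta = np.array([[ 0.5, -1.0],
                 [ 1.5,  0.2],
                 [-0.5,  0.8]])

# Pivot around the last class: beta_prime_k = beta_k - beta_K.
beta_prime = beta - beta[-1]
print(beta_prime)  # the last row is now all zeros

# Both parameterizations give identical probabilities for any X_i.
x = np.array([1.0, 2.0])
for b in (beta, beta_prime):
    s = b @ x
    p = np.exp(s - s.max()) / np.exp(s - s.max()).sum()
    print(p)
```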
{\displaystyle \Pr(Y_{i}=k)={\frac {e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}{1+\sum _{j=1}^{K-1}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}}\;\;\;\;,\;\;k<K}
, meaning that it falls into any one of a set of categories that cannot be ordered in any meaningful way) and for which there are more than two categories. Some examples would be:
{\displaystyle \Pr(Y_{i}=k)={\frac {e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}{\sum _{j=1}^{K}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}}\;\;\;\;,\;\;k\leq K}
separately identifiable vectors of coefficients. One way to see this is to note that if we add a constant vector to all of the coefficient vectors, the equations are identical:
, where there is some randomness in the actual amount of utility obtained, which accounts for other unmodeled factors that go into the choice. The value of the actual variable
is then determined in a non-random fashion from these latent variables (i.e. the randomness has been moved from the observed outcomes into the latent variables), where outcome
(the first, i.e. maximum) of a set of values. However, it can be shown that the resulting expressions are the same as in above formulations, i.e. the two are equivalent.
. This is due to the fact that all probabilities must sum to 1, making one of them completely determined once all the rest are known. As a result, there are only
then the overall model has only 0.9^5 ≈ 59% accuracy. If each submodel has 80% accuracy, then overall accuracy drops to 0.8^5 ≈ 33%. This issue is known as error propagation
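The arithmetic behind these figures, as a one-line check:

```python
# Five submodels in series, each correct 90% (resp. 80%) of the time.
print(0.9 ** 5)  # ~0.59, i.e. roughly 59% overall accuracy
print(0.8 ** 5)  # ~0.33, i.e. roughly 33% overall accuracy
```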
{\displaystyle \Pr(Y_{i}=c)=\operatorname {softmax} (c,{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i},\ldots ,{\boldsymbol {\beta }}_{K}\cdot \mathbf {X} _{i})}
Or generally:
{\displaystyle \Pr(Y_{i}=c)={\frac {e^{{\boldsymbol {\beta }}_{c}\cdot \mathbf {X} _{i}}}{\sum _{j=1}^{K}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}}}
, i.e. with more than two possible discrete outcomes. That is, it is a model that is used to predict the probabilities of the different possible outcomes of a categorically distributed dependent variable, given a set of independent variables
, outcomes of various liver-function tests, etc.). The goal is then to predict which disease is causing the observed liver-related symptoms in a new patient.
. The explanatory variables and outcome represent observed properties of the data points, and are often thought of as originating in the observations of
is called the partition function for the distribution. We can compute the value of the partition function by applying the above constraint that requires all probabilities to sum to 1:
{\displaystyle f(k)={\begin{cases}1\;{\textrm {if}}\;k=\operatorname {\arg \max } (x_{1},\ldots ,x_{n}),\\0\;{\textrm {otherwise}}.\end{cases}}}
0, scale 1) for the error variables entails no loss of generality over using an arbitrary extreme-value distribution. In fact, the model is nonidentifiable
presents a number of equivalent formulations of simple logistic regression, and many of these have analogues in the multinomial logit model.
Other than the prime symbols on the regression coefficients, this is exactly the same as the form of the model described above, in terms of
is assumed to be relatively low, as it becomes difficult to differentiate between the impact of several variables if this is not the case.
{\displaystyle \Pr(Y_{i}=k)\;=\;\Pr(\max(Y_{i,1}^{\ast },Y_{i,2}^{\ast },\ldots ,Y_{i,K}^{\ast })=Y_{i,k}^{\ast })\;\;\;\;,\;\;k\leq K}
The difference between the multinomial logit model and numerous other methods, models, algorithms, etc. with the same basic setup (the
regularization of the weights to prevent pathological solutions (usually a squared regularizing function, which is equivalent to placing a zero-mean
on the weights, but other distributions are also possible). The solution is typically found using an iterative procedure such as
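In practice this optimization is rarely coded by hand. As a minimal sketch (assuming scikit-learn is available; the data here are synthetic), the multinomial logit can be fit by an iterative solver with an L2 penalty, which corresponds to the zero-mean Gaussian prior on the weights mentioned above:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))     # 300 observations, 4 explanatory variables
y = rng.integers(0, 3, size=300)  # 3 unordered outcome categories

# L2-regularized multinomial logit fit by an iterative solver (L-BFGS).
model = LogisticRegression(solver="lbfgs", C=1.0).fit(X, y)
print(model.coef_.shape)           # one coefficient vector per class
print(model.predict_proba(X[:2]))  # each row sums to 1
```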
When using multinomial logistic regression, one category of the dependent variable is chosen as the reference category. Separate
{\displaystyle \ln {\frac {\Pr(Y_{i}=k)}{\Pr(Y_{i}=K)}}\,=\,{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}\;\;\;\;,\;\;k<K}
In a hands-free mobile phone dialing application, which person's name was spoken, given various properties of the speech signal?
{\displaystyle L=\prod _{i=1}^{n}P(Y_{i}=y_{i})=\prod _{i=1}^{n}\left(\prod _{j=1}^{K}P(Y_{i}=j)^{\delta _{j,y_{i}}}\right),}
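A direct transcription of this likelihood (a sketch; the probs array and labels y are invented stand-ins, with probs[i, j] playing the role of P(Y_i = j) and classes indexed from 0):

```python
import numpy as np

# Invented stand-ins: probs[i, j] = P(Y_i = j), y[i] = observed class of observation i.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.3, 0.5]])
y = np.array([0, 1, 2])

# The Kronecker delta picks out the probability of the observed class, so the
# double product collapses to a single product over observations.
L = np.prod(probs[np.arange(len(y)), y])
print(L)  # 0.7 * 0.6 * 0.5 = 0.21
```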
Which country will a firm locate an office in, given the characteristics of the firm and of the various candidate countries?
-1 choices are, relative to the choice we are pivoting around. Mathematically, we transform the coefficients as follows:
Actually finding the values of the above probabilities is somewhat difficult, and is a problem of computing a particular order statistic
{\displaystyle \Pr(Y_{i}=k)\,=\,{\Pr(Y_{i}=K)}\;e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}\;\;\;\;,\;\;k<K}
possible outcomes rather than just two. The following description is somewhat shortened; for more details, consult the
to be predicted that comes from one of a limited set of items that cannot be meaningfully ordered, as well as a set of
{\displaystyle Y_{i,k}^{\ast }={\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}+\varepsilon _{k}\;\;\;\;,\;\;k\leq K}
2748:{\displaystyle \Pr(Y_{i}=k)={\frac {1}{Z}}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}\;\;\;\;,\;\;k\leq K}
2571:
1691:
transform commonly used in compositional data analysis. In other applications it is referred to as "relative risk".
disappear (for instance, if one political candidate withdraws from a three-candidate race). Other models like the
vis-à-vis the reference category, associated with a one-unit change of the corresponding independent variable.
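For example (a sketch with an invented coefficient value):

```python
import math

beta = 0.4                   # hypothetical fitted coefficient
odds_ratio = math.exp(beta)  # multiplicative change in the odds per one-unit change
print(odds_ratio)            # ~1.49: the odds rise by about 49%
```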
independent binary logistic regression models, in which one outcome is chosen as a "pivot" and then the other
{\displaystyle \operatorname {softmax} (k,x_{1},\ldots ,x_{n})={\frac {e^{x_{k}}}{\sum _{i=1}^{n}e^{x_{i}}}}}
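A direct implementation of this function (sketch):

```python
import numpy as np

def softmax(x):
    """exp(x_k) / sum_i exp(x_i), computed for every k at once."""
    e = np.exp(x - np.max(x))  # shifting by the max leaves the ratio unchanged
    return e / e.sum()

x = np.array([1.0, 2.0, 8.0])
print(softmax(x))  # the largest entry dominates: roughly [0.001, 0.002, 0.997]
```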
{\displaystyle \operatorname {score} (\mathbf {X} _{i},k)={\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i},}
-1 of the coefficient vectors are identifiable, and the last one can be set to an arbitrary value (e.g. 0).
It is also possible to formulate multinomial logistic regression as a latent variable model, following the
{\displaystyle \ln \Pr(Y_{i}=k)={\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}-\ln Z\;\;\;\;,\;\;k\leq K}
article, the regression coefficients and explanatory variables are normally grouped into vectors of size
{\displaystyle \delta _{j,y_{i}}={\begin{cases}1{\text{ for }}j=y_{i}\\0{\text{ otherwise}}\end{cases}}}
{\displaystyle f(k,i)=\beta _{0,k}+\beta _{1,k}x_{1,i}+\beta _{2,k}x_{2,i}+\cdots +\beta _{M,k}x_{M,i},}
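In code, the predictor is one dot product per class (a sketch with invented numbers; the intercept β_{0,k} is handled by a leading feature fixed at 1):

```python
import numpy as np

# beta[k] = (beta_0k, beta_1k, ..., beta_Mk); x_i carries a leading 1 for the intercept.
beta = np.array([[ 0.1, 0.5, -0.2],
                 [-0.3, 0.8,  0.1]])
x_i = np.array([1.0, 2.0, -1.0])

scores = beta @ x_i  # f(k, i) for every class k at once
print(scores)
```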
In discrete choice theory, where observations represent people and outcomes represent choices, the score is considered the utility associated with person i choosing outcome k.
) is greater than the utilities of all the other choices, i.e. if the utility associated with outcome
, where the first parameter is unimportant. This is understandable since the first parameter is a
, the probability of two having exactly the same value is 0, so we ignore the scenario. That is:
of the probability of seeing a given output using the linear predictor as well as an additional
Which major will a college student choose, given their grades, stated likes and dislikes, etc.?
-1 independent binary choices, in which one alternative is chosen as a "pivot" and the other
The fact that we run multiple regressions reveals why the model relies on the assumption of
{\displaystyle -\log L=-\sum _{i=1}^{n}\sum _{j=1}^{K}\delta _{j,y_{i}}\log(P(Y_{i}=j)).}
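In code, the Kronecker delta amounts to indexing out the probability of the observed class (a sketch; probs and y are invented stand-ins as before):

```python
import numpy as np

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.3, 0.5]])
y = np.array([0, 1, 2])

# -log L = -sum_i log P(Y_i = y_i): the cross-entropy of the model on the data.
nll = -np.log(probs[np.arange(len(y)), y]).sum()
print(nll)
```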
of the explained variables are considered as realizations of stochastically independent,
described for binary logistic regression. This formulation is common in the theory of discrete choice
The negative log-likelihood function is therefore the well-known cross-entropy:
models, and makes it easier to compare multinomial logistic regression to the related
{\displaystyle Z=\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}
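Computing Z and the resulting probabilities (a sketch; scores[k] stands for β_k · X_i with invented values):

```python
import numpy as np

scores = np.array([1.2, -0.3, 0.7])  # beta_k . X_i for each class k
Z = np.exp(scores).sum()             # the partition function
probs = np.exp(scores) / Z
print(probs, probs.sum())            # the probabilities sum to 1
```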
Which candidate will a person vote for, given particular demographic characteristics?
Which blood type does a person have, given the results of various diagnostic tests?
If the multinomial logit is used to model choices, it relies on the assumption of
(no single set of optimal coefficients) if the more general distribution is used.
Multinomial logistic regression is known by a variety of other names, including
(a row vector) is the set of explanatory variables associated with observation
As in other forms of linear regression, multinomial logistic regression uses a
30:"Multinomial regression" redirects here. For the related Probit procedure, see
7863:
7812:
7782:
6931:
6115:
Let's look more closely at the first equation, which we can write as follows:
Note that this factor is "constant" in the sense that it is not a function of
-1 outcomes are separately regressed against the pivot outcome. If outcome
can be directly extended to multi-way regression. That is, we model the
with the explanatory variables (features) of a given observation using a
{\displaystyle f(k,i)={\boldsymbol {\beta }}_{k}\cdot \mathbf {x} _{i},}
The second parameter in an extreme-value or logistic distribution is a scale parameter, such that if
If we exponentiate both sides and solve for the probabilities, we get:
(which may be real-valued, binary-valued, categorical-valued, etc.).
(MAP) estimation, must be learned using an iterative procedure; see the Estimating the coefficients section above.
, multinomial LR classifiers are commonly used as an alternative to
is the maximum of all the utilities. Since the latent variables are
predictions, errors tend to multiply. For example, imagine a large predictive model
{\displaystyle \varepsilon _{k}\sim \operatorname {EV} _{1}(0,1),}
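This latent-variable formulation can be checked by simulation: drawing standard Gumbel (type-1 extreme value) errors and taking the argmax of the latent utilities reproduces the softmax probabilities (a sketch; the score values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
scores = np.array([1.2, -0.3, 0.7])  # beta_k . X_i for K = 3 classes

# Simulate Y*_ik = beta_k . X_i + eps_k with eps_k ~ EV1(0, 1), i.e. standard Gumbel.
draws = scores + rng.gumbel(size=(200_000, 3))
empirical = np.bincount(draws.argmax(axis=1), minlength=3) / len(draws)

softmax = np.exp(scores) / np.exp(scores).sum()
print(empirical)  # close to ...
print(softmax)    # ... the multinomial logit probabilities
```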
The observed outcomes are different variants of a disease such as
may be used in such cases as they allow for violation of the IIA.
choices, and examining how much better or worse all of the other
, so that the predictor function can be written more compactly:
{\displaystyle \operatorname {softmax} (k,x_{1},\ldots ,x_{n})}
To arrive at the multinomial logit model, one can imagine, for
is chosen if and only if the associated utility (the value of
. The reason is that the effect of exponentiating the values
is the set of regression coefficients associated with outcome
is the vector of explanatory variables describing observation
is to exaggerate the differences between them. As a result,
. The predicted outcome is the one with the highest score.
The softmax function thus serves as the equivalent of the
, predictor variables, features, etc.), and an associated
The likelihood function for this model is defined by:
{\displaystyle X-Y\sim \operatorname {Logistic} (0,b).}
model, as well as to extend it to more complex models.
, which we will need to determine through some sort of
that constructs a score from a set of weights that are
{\displaystyle bX\sim \operatorname {Logistic} (0,b).}
to ensure that the whole set of probabilities forms a
Specifically, it is assumed that we have a series of
) is the score associated with assigning observation
{\displaystyle X\sim \operatorname {Logistic} (0,1)}
The formulation of binary logistic regression as a log-linear model
In general, if {\displaystyle X\sim \operatorname {EV} _{1}(a,b)} and {\displaystyle Y\sim \operatorname {EV} _{1}(a,b)} are independent, then {\displaystyle X-Y\sim \operatorname {Logistic} (0,b)}.
The resulting equations for the probabilities are
We can use this to find the other probabilities:
Multinomial logistic regression is used when the dependent variable in question is nominal (equivalently categorical
extreme-value-distributed variables follows the
of the probabilities must sum to one, we find:
(the last outcome) is chosen as the pivot, the
, response variable), which can take on one of
denotes the observations 1 to n and the index
This latent variable can be thought of as the
The idea behind all of them, as in many other
{\displaystyle C=-{\boldsymbol {\beta }}_{K}}
{\displaystyle \sum _{k=1}^{K}\Pr(Y_{i}=k)=1}
As in the binary case, we need an extra term
to predict the probability that observation
of the random variables (commonly known as
(MAP) estimation, which is an extension of
from each other (unlike, for example, in a
Application in natural language processing
As a set of independent binary regressions
{\displaystyle {\boldsymbol {\beta }}_{k}}
There are a few things to realize here:
will return a value close to 0 whenever
This leads to the following equations:
As a result, it is conventional to set
independence of irrelevant alternatives
observed data points. Each data point
alternatives to be modeled as a set of
independence of irrelevant alternatives
{\displaystyle y_{i}\in \{0,1,\dots ,K\}}
The unknown parameters in each vector
This formulation is also known as the
vectors of coefficients are uniquely
, the only difference being that the
These are all statistical classification problems. They all have in common a dependent variable
-1 independent two-way regressions.
, i.e. so that they all sum to one:
iteratively reweighted least squares
independent identically distributed
, etc.) and which approximates the
{\displaystyle x_{1},\ldots ,x_{n}}
are typically jointly estimated by
{\displaystyle Y_{1},\dots ,Y_{n}}
) that is distributed as follows:
Imagine that, for each data point
The basic setup is the same as in
th outcome. As explained in the
conditional maximum entropy model
{\displaystyle \mathbf {x} _{i}}
th explanatory variable and the
That is, the difference of two
{\displaystyle Y_{i,k}^{\ast }}
in binary logistic regression.
multinomial logistic regression
techniques, is to construct a
two-way latent variable model
generalized iterative scaling
-1 regression equations are:
denotes the classes 1 to K.
{\displaystyle i=1,\dots ,n}
{\displaystyle \beta _{m,k}}
linear discriminant analysis
because they do not assume
natural language processing
associated with data point
(which can be conveniently differentiated
gradient-based optimization
Estimating the coefficients
possible outcomes, running
) corresponding to outcome
is a vector of weights (or
extreme value distribution
As a latent-variable model
{\displaystyle \beta _{k}}
statistical classification
categorically distributed
Note that not all of the
, of the following form:
linear predictor function
linear predictor function
statistically independent
statistical independence
, there is a continuous latent variable
The following function:
probability distribution
Using the fact that all
method that generalizes
naive Bayes classifiers
is the Kronecker delta.
Estimation of intercept
i.e. a standard type-1
, the logarithm of the
) consists of a set of
support vector machines
regression coefficients
is referred to as the softmax function
{\displaystyle -\ln Z}
regression coefficient
{\displaystyle f(k,i)}
explanatory variables
naive Bayes classifier
) classifier, and the
logistic distribution
{\displaystyle Y_{i}}
and possible outcome
{\displaystyle x_{k}}
As a log-linear model
independent variables
independent variables
maximum a posteriori
The observed values
(i.e. an unobserved random variable
normalization factor
, or by specialized
(IRLS), by means of
maximum a posteriori
associated with the
Likelihood function
{\displaystyle k-1}
{\displaystyle k-1}
algorithms such as
logistic regression
logistic regression
dependent variables
logistic regression
logistic regression
Logistic regression
multiclass problems
logistic regression
location parameter
multinomial probit
indicator function
that behaves as a smooth function
partition function
coordinate descent
prior distribution
maximum likelihood
Additive Log Ratio
dependent variable (also known as
multinomial probit
{\displaystyle j}
{\displaystyle i}
random variables
Or equivalently:
choosing outcome
logistic function
described above.
, i.e. there are
linearly combined
multinomial logit
where the index
7907:
7905:
7897:
7896:
7891:
7886:
7876:
7875:
7871:
7870:
7857:(1–2): 41–75.
7834:
7819:
7786:
7775:
7764:(2): 115–125.
7746:
7739:
7721:
7698:
7691:
7665:
7654:(4): 233–252.
7638:
7631:
7609:
7607:
7604:
7603:
7602:
7597:
7590:
7587:
7558:
7555:
7543:
7540:
7537:
7534:
7531:
7526:
7522:
7518:
7515:
7512:
7509:
7506:
7499:
7495:
7491:
7488:
7484:
7478:
7473:
7470:
7467:
7463:
7457:
7452:
7449:
7446:
7442:
7438:
7435:
7432:
7429:
7426:
7423:
7412:
7411:
7397:
7387:
7384:
7383:
7378:
7374:
7370:
7367:
7359:
7356:
7355:
7353:
7348:
7341:
7337:
7333:
7330:
7326:
7305:
7285:
7265:
7261:
7251:
7247:
7243:
7240:
7236:
7231:
7227:
7224:
7219:
7215:
7211:
7208:
7203:
7198:
7195:
7192:
7188:
7183:
7177:
7172:
7169:
7166:
7162:
7158:
7155:
7150:
7146:
7142:
7137:
7133:
7129:
7126:
7121:
7116:
7113:
7110:
7106:
7102:
7099:
7071:
7067:
7063:
7060:
7057:
7052:
7048:
7023:
7020:
7017:
7014:
7011:
7008:
7005:
6984:
6981:
6978:
6975:
6972:
6969:
6965:
6960:
6956:
6943:
6940:
6927:
6924:
6916:
6915:
6907:
6890:
6887:
6884:
6881:
6878:
6875:
6872:
6869:
6866:
6863:
6860:
6840:
6837:
6834:
6831:
6828:
6825:
6822:
6819:
6816:
6801:
6777:
6774:
6771:
6768:
6765:
6762:
6759:
6756:
6753:
6750:
6747:
6744:
6724:
6721:
6718:
6715:
6712:
6709:
6704:
6700:
6696:
6693:
6673:
6670:
6667:
6664:
6661:
6658:
6653:
6649:
6645:
6642:
6627:
6626:
6611:
6608:
6605:
6602:
6599:
6596:
6593:
6590:
6584:
6576:
6572:
6568:
6563:
6559:
6555:
6550:
6545:
6540:
6537:
6532:
6527:
6522:
6517:
6512:
6507:
6504:
6501:
6498:
6495:
6493:
6491:
6488:
6485:
6482:
6479:
6476:
6473:
6470:
6467:
6461:
6455:
6452:
6449:
6444:
6440:
6436:
6431:
6426:
6421:
6416:
6411:
6406:
6403:
6398:
6394:
6390:
6385:
6380:
6375:
6370:
6365:
6360:
6357:
6354:
6351:
6349:
6347:
6344:
6341:
6338:
6335:
6332:
6329:
6326:
6323:
6317:
6311:
6308:
6303:
6298:
6295:
6292:
6288:
6284:
6279:
6274:
6271:
6268:
6264:
6260:
6257:
6254:
6251:
6249:
6247:
6244:
6241:
6238:
6235:
6232:
6229:
6226:
6223:
6217:
6209:
6204:
6201:
6198:
6194:
6190:
6185:
6180:
6177:
6174:
6170:
6166:
6163:
6160:
6157:
6155:
6153:
6150:
6147:
6142:
6138:
6134:
6131:
6128:
6127:
6113:
6112:
6101:
6098:
6095:
6090:
6083:
6078:
6073:
6070:
6067:
6063:
6059:
6056:
6051:
6046:
6043:
6040:
6036:
6032:
6029:
6026:
6021:
6016:
6013:
6010:
6006:
6002:
5997:
5992:
5989:
5986:
5982:
5978:
5975:
5972:
5969:
5965:
5961:
5958:
5955:
5950:
5946:
5942:
5939:
5925:
5924:
5909:
5904:
5899:
5896:
5893:
5890:
5887:
5883:
5879:
5874:
5869:
5866:
5863:
5859:
5850:
5840:
5835:
5832:
5829:
5825:
5821:
5816:
5811:
5808:
5805:
5801:
5790:
5785:
5782:
5779:
5775:
5771:
5766:
5761:
5758:
5755:
5751:
5747:
5744:
5741:
5738:
5736:
5734:
5731:
5728:
5723:
5719:
5715:
5712:
5709:
5708:
5704:
5701:
5700:
5697:
5692:
5687:
5684:
5681:
5677:
5673:
5668:
5663:
5660:
5657:
5653:
5644:
5634:
5629:
5626:
5623:
5619:
5615:
5610:
5605:
5602:
5599:
5595:
5584:
5579:
5576:
5573:
5569:
5565:
5560:
5555:
5552:
5549:
5545:
5541:
5538:
5535:
5532:
5530:
5528:
5525:
5522:
5517:
5513:
5509:
5506:
5503:
5502:
5499:
5494:
5489:
5486:
5483:
5479:
5475:
5470:
5465:
5462:
5459:
5455:
5446:
5436:
5431:
5428:
5425:
5421:
5417:
5412:
5407:
5404:
5401:
5397:
5386:
5381:
5378:
5375:
5371:
5367:
5362:
5357:
5354:
5351:
5347:
5343:
5340:
5337:
5334:
5332:
5330:
5327:
5324:
5319:
5315:
5311:
5308:
5305:
5304:
5271:
5266:
5263:
5260:
5256:
5229:
5225:
5185:
5182:
5179:
5176:
5173:
5170:
5167:
5162:
5158:
5154:
5149:
5145:
5133:
5132:
5121:
5118:
5115:
5110:
5101:
5097:
5093:
5088:
5083:
5078:
5073:
5068:
5063:
5058:
5053:
5050:
5047:
5043:
5021:
4988:
4985:
4977:
4976:
4965:
4962:
4959:
4954:
4940:
4935:
4930:
4926:
4922:
4917:
4911:
4905:
4902:
4899:
4894:
4891:
4888:
4884:
4880:
4877:
4869:
4864:
4859:
4855:
4851:
4846:
4840:
4834:
4831:
4828:
4825:
4820:
4816:
4812:
4809:
4795:
4794:
4779:
4776:
4773:
4771:
4768:
4764:
4759:
4754:
4753:
4750:
4747:
4744:
4740:
4732:
4727:
4722:
4717:
4712:
4707:
4704:
4702:
4699:
4695:
4690:
4685:
4684:
4651:
4646:
4641:
4638:
4635:
4624:
4623:
4601:
4596:
4591:
4586:
4581:
4575:
4569:
4564:
4561:
4558:
4554:
4545:
4540:
4535:
4530:
4525:
4519:
4513:
4510:
4508:
4506:
4496:
4491:
4486:
4481:
4476:
4470:
4464:
4459:
4456:
4453:
4449:
4441:
4436:
4431:
4428:
4424:
4414:
4409:
4404:
4399:
4394:
4388:
4380:
4375:
4370:
4367:
4363:
4356:
4353:
4351:
4349:
4339:
4334:
4329:
4326:
4322:
4314:
4309:
4304:
4299:
4294:
4288:
4282:
4277:
4274:
4271:
4267:
4257:
4252:
4247:
4244:
4240:
4232:
4227:
4222:
4217:
4212:
4206:
4199:
4196:
4194:
4185:
4180:
4175:
4172:
4169:
4166:
4161:
4156:
4151:
4147:
4141:
4136:
4133:
4130:
4126:
4117:
4112:
4107:
4104:
4101:
4098:
4093:
4088:
4083:
4079:
4073:
4072:
4049:
4046:
4043:
4023:
4020:
4017:
3991:
3987:
3968:
3967:
3956:
3951:
3946:
3941:
3936:
3931:
3926:
3923:
3920:
3915:
3910:
3905:
3900:
3895:
3890:
3887:
3884:
3881:
3878:
3875:
3872:
3869:
3866:
3861:
3857:
3853:
3850:
3836:
3835:
3822:
3817:
3806:
3803:
3802:
3799:
3796:
3791:
3787:
3783:
3780:
3777:
3772:
3768:
3764:
3761:
3757:
3754:
3751:
3747:
3744:
3732:
3729:
3728:
3726:
3721:
3718:
3715:
3712:
3709:
3692:differentiated
3667:
3663:
3641:
3636:
3632:
3628:
3625:
3622:
3617:
3613:
3609:
3606:
3603:
3600:
3597:
3575:
3571:
3567:
3564:
3561:
3556:
3552:
3536:
3535:
3517:
3513:
3508:
3502:
3497:
3494:
3491:
3487:
3478:
3474:
3469:
3463:
3460:
3455:
3451:
3447:
3444:
3441:
3436:
3432:
3428:
3425:
3422:
3419:
3416:
3402:
3401:
3383:
3378:
3373:
3368:
3363:
3357:
3351:
3346:
3343:
3340:
3336:
3327:
3322:
3317:
3312:
3307:
3301:
3295:
3292:
3289:
3286:
3281:
3277:
3273:
3270:
3258:Or generally:
3256:
3255:
3243:
3240:
3237:
3232:
3218:
3213:
3208:
3203:
3198:
3192:
3186:
3181:
3178:
3175:
3171:
3162:
3157:
3152:
3147:
3142:
3136:
3130:
3127:
3124:
3121:
3116:
3112:
3108:
3105:
3081:
3070:
3064:
3063:
3048:
3043:
3038:
3033:
3028:
3022:
3016:
3011:
3008:
3005:
3001:
2997:
2994:
2980:
2979:
2964:
2959:
2954:
2949:
2944:
2938:
2932:
2927:
2924:
2921:
2917:
2911:
2908:
2902:
2894:
2889:
2884:
2879:
2874:
2868:
2862:
2859:
2852:
2847:
2844:
2841:
2837:
2832:
2828:
2825:
2822:
2817:
2813:
2809:
2806:
2801:
2796:
2793:
2790:
2786:
2782:
2779:
2763:is called the
2757:
2756:
2744:
2741:
2738:
2733:
2722:
2717:
2712:
2707:
2702:
2696:
2690:
2687:
2682:
2679:
2676:
2673:
2668:
2664:
2660:
2657:
2638:
2637:
2626:
2623:
2620:
2617:
2614:
2609:
2605:
2601:
2598:
2593:
2588:
2585:
2582:
2578:
2550:
2547:
2544:
2541:
2530:
2529:
2517:
2514:
2511:
2506:
2499:
2496:
2493:
2490:
2485:
2480:
2475:
2470:
2465:
2460:
2457:
2454:
2451:
2446:
2442:
2438:
2435:
2432:
2429:
2398:
2395:
2364:regularization
2350:
2342:
2339:
2331:
2330:
2318:
2315:
2312:
2307:
2293:
2288:
2283:
2278:
2273:
2267:
2261:
2258:
2255:
2250:
2247:
2244:
2240:
2236:
2233:
2225:
2220:
2215:
2210:
2205:
2199:
2193:
2190:
2187:
2184:
2179:
2175:
2171:
2168:
2154:
2153:
2134:
2129:
2124:
2119:
2114:
2108:
2102:
2099:
2096:
2091:
2088:
2085:
2081:
2077:
2074:
2070:
2064:
2060:
2057:
2054:
2049:
2045:
2041:
2038:
2033:
2024:
2019:
2014:
2009:
2004:
1998:
1992:
1989:
1986:
1981:
1977:
1973:
1970:
1964:
1961:
1958:
1953:
1950:
1947:
1943:
1939:
1936:
1932:
1928:
1925:
1922:
1917:
1913:
1909:
1906:
1901:
1898:
1895:
1890:
1887:
1884:
1880:
1876:
1873:
1869:
1865:
1862:
1859:
1854:
1850:
1846:
1843:
1825:
1824:
1813:
1810:
1807:
1802:
1791:
1786:
1781:
1776:
1771:
1765:
1759:
1756:
1753:
1748:
1744:
1740:
1737:
1732:
1728:
1725:
1722:
1717:
1713:
1709:
1706:
1685:
1684:
1672:
1669:
1666:
1661:
1652:
1647:
1642:
1637:
1632:
1626:
1619:
1616:
1613:
1608:
1604:
1600:
1597:
1592:
1589:
1586:
1581:
1577:
1573:
1570:
1564:
1561:
1526:
1523:
1504:
1499:
1471:
1466:
1453:
1452:
1441:
1436:
1431:
1426:
1421:
1416:
1411:
1408:
1405:
1402:
1399:
1396:
1393:
1348:
1345:
1342:
1338:
1326:
1325:
1314:
1309:
1306:
1303:
1299:
1293:
1290:
1287:
1283:
1279:
1276:
1273:
1268:
1265:
1262:
1258:
1252:
1249:
1246:
1242:
1238:
1233:
1230:
1227:
1223:
1217:
1214:
1211:
1207:
1203:
1198:
1195:
1192:
1188:
1184:
1181:
1178:
1175:
1172:
1169:
1166:
1135:
1132:
1129:
1126:
1123:
1120:
1105:
1102:
1101:
1100:
1096:
1093:blood pressure
1058:
1041:
1032:
1016:(ranging from
1005:
1002:
973:
970:
897:
880:
867:
861:
860:
849:
844:
839:
834:
829:
824:
819:
816:
813:
810:
805:
800:
795:
792:
789:
750:
747:
739:
736:
683:
680:
666:These are all
664:
663:
660:
657:
654:
651:
642:(equivalently
631:
628:
567:classification
553:
552:
550:
549:
542:
535:
527:
524:
523:
522:
521:
506:
505:
504:
503:
498:
493:
488:
483:
478:
470:
469:
465:
464:
463:
462:
457:
452:
447:
442:
434:
433:
432:
431:
426:
421:
416:
411:
403:
402:
401:
400:
395:
390:
385:
377:
376:
375:
374:
369:
364:
356:
355:
351:
350:
349:
348:
340:
339:
338:
337:
332:
327:
322:
317:
312:
307:
302:
300:Semiparametric
297:
292:
284:
283:
282:
281:
276:
271:
269:Random effects
266:
261:
253:
252:
251:
250:
245:
243:Ordered probit
240:
235:
230:
225:
220:
215:
210:
205:
200:
195:
190:
182:
181:
180:
179:
174:
169:
164:
156:
155:
151:
150:
144:
143:
135:
134:
49:
47:
40:
26:
24:
14:
13:
10:
9:
6:
4:
3:
2:
As a set of independent binary regressions

One fairly simple way to arrive at the multinomial logit model is to imagine, for K possible outcomes, running K−1 independent binary logistic regression models, in which one outcome is chosen as a "pivot" and the other K−1 outcomes are separately regressed against the pivot outcome. This proceeds as follows, if outcome K (the last outcome) is chosen as the pivot:

{\displaystyle \ln {\frac {\Pr(Y_{i}=k)}{\Pr(Y_{i}=K)}}={\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i},\qquad k<K.}

Exponentiating both sides and solving for the probabilities gives

{\displaystyle \Pr(Y_{i}=k)=\Pr(Y_{i}=K)\,e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}},\qquad k<K.}

Using the fact that all K of the probabilities must sum to one, we find

{\displaystyle \Pr(Y_{i}=K)=1-\sum _{j=1}^{K-1}\Pr(Y_{i}=j)=1-\sum _{j=1}^{K-1}\Pr(Y_{i}=K)\,e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}\quad \Rightarrow \quad \Pr(Y_{i}=K)={\frac {1}{1+\sum _{j=1}^{K-1}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}}.}

We can use this to find the other probabilities:

{\displaystyle \Pr(Y_{i}=k)={\frac {e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}}{1+\sum _{j=1}^{K-1}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}},\qquad k<K.}

The fact that we run multiple regressions against a common pivot makes clear why the model relies on the assumption of independence of irrelevant alternatives described above.
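The following short NumPy sketch (a toy illustration with invented data, not reference code) evaluates these pivot-based formulas for a single observation; note that the pivot outcome K carries no coefficient vector of its own:

import numpy as np

rng = np.random.default_rng(1)
M, K = 3, 4
x_i = np.append(1.0, rng.normal(size=M))   # one observation, with intercept term
beta = rng.normal(size=(K - 1, M + 1))     # beta_1 .. beta_{K-1}; the pivot has none

scores = beta @ x_i                        # beta_k . x_i for k < K
denom = 1.0 + np.exp(scores).sum()
p = np.append(np.exp(scores), 1.0) / denom # Pr(Y=k) for k < K, then pivot Pr(Y=K)
print(p, p.sum())                          # the K probabilities sum to 1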
The unknown parameters in each vector β_k are typically jointly estimated by maximum a posteriori (MAP) estimation, which is an extension of maximum likelihood using regularization of the weights to prevent pathological solutions (usually a squared regularizing function, which is equivalent to placing a zero-mean Gaussian prior distribution on the weights, but other distributions are also possible). The solution is typically found using an iterative procedure such as generalized iterative scaling, iteratively reweighted least squares (IRLS), or gradient-based optimization algorithms such as L-BFGS.
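As a sketch of how such an estimate can be computed (an illustration under assumed toy data, not the article's procedure; lam, neg_log_posterior and the other names are invented), the MAP objective with a Gaussian prior is just the negative log-likelihood plus a squared penalty, handed to an off-the-shelf L-BFGS optimizer:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
N, M, K, lam = 200, 3, 4, 1.0
X = np.hstack([np.ones((N, 1)), rng.normal(size=(N, M))])
y = rng.integers(K, size=N)  # toy labels standing in for observed outcomes

def neg_log_posterior(w):
    beta = w.reshape(K, M + 1)
    scores = X @ beta.T                          # (N, K) linear predictors
    scores = scores - scores.max(axis=1, keepdims=True)   # numerical stability
    log_p = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    nll = -log_p[np.arange(N), y].sum()          # negative log-likelihood
    return nll + 0.5 * lam * (w ** 2).sum()      # Gaussian prior = squared penalty

res = minimize(neg_log_posterior, np.zeros(K * (M + 1)), method="L-BFGS-B")
print(res.success, res.fun)

For brevity the sketch penalizes the intercepts too and lets SciPy approximate the gradient numerically; a practical implementation would supply the analytic gradient.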
As a log-linear model

The formulation of binary logistic regression as a log-linear model can be extended directly to multi-way regression. That is, we model the logarithm of the probability of seeing a given output using a linear predictor as well as an additional normalization factor:

{\displaystyle \ln \Pr(Y_{i}=k)={\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}-\ln Z,\qquad k\leq K.}

As in the binary case, we need the extra term −ln Z to ensure that the whole set of probabilities sums to one:

{\displaystyle \sum _{k=1}^{K}\Pr(Y_{i}=k)=1.}

The reason we add a term for normalization, rather than the usual multiplicative factor, is that we have taken the logarithm of the probabilities. Exponentiating both sides turns the additive term into a multiplicative factor, so that the probability is just the Gibbs measure:

{\displaystyle \Pr(Y_{i}=k)={\frac {1}{Z}}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}},\qquad k\leq K.}

The quantity Z is called the partition function for the distribution. We can compute its value by applying the constraint above that the probabilities sum to one:

{\displaystyle 1=\sum _{k=1}^{K}\Pr(Y_{i}=k)=\sum _{k=1}^{K}{\frac {1}{Z}}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}={\frac {1}{Z}}\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}.}

Therefore:

{\displaystyle Z=\sum _{k=1}^{K}e^{{\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}}.}

Note that this factor is "constant" in the sense that it is not a function of Y_i, the variable over which the probability distribution is defined. However, it is definitely not constant with respect to the explanatory variables or, crucially, with respect to the unknown regression coefficients β_k, which we will need to determine through some sort of optimization procedure.

The resulting equations for the probabilities are

{\displaystyle \Pr(Y_{i}=c)={\frac {e^{{\boldsymbol {\beta }}_{c}\cdot \mathbf {X} _{i}}}{\sum _{j=1}^{K}e^{{\boldsymbol {\beta }}_{j}\cdot \mathbf {X} _{i}}}}.}

The function

{\displaystyle \operatorname {softmax} (k,x_{1},\ldots ,x_{n})={\frac {e^{x_{k}}}{\sum _{i=1}^{n}e^{x_{i}}}}}

is referred to as the softmax function. The effect of exponentiating the values x_1, ..., x_n is to exaggerate the differences between them: softmax(k, x_1, ..., x_n) returns a value close to 0 whenever x_k is significantly less than the maximum of all the values, and a value close to 1 when applied to the maximum, unless that maximum is extremely close to the next-largest value. Thus the softmax function constructs a smooth (conveniently differentiable) approximation to the indicator function

{\displaystyle f(k)={\begin{cases}1&{\text{if }}k=\operatorname {arg\,max} (x_{1},\ldots ,x_{n}),\\0&{\text{otherwise}}.\end{cases}}}

In these terms, the probability equations can be written compactly as

{\displaystyle \Pr(Y_{i}=c)=\operatorname {softmax} (c,{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i},\ldots ,{\boldsymbol {\beta }}_{K}\cdot \mathbf {X} _{i}),}

so the softmax function serves as the equivalent of the logistic function in binary logistic regression.
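One practical note (a standard numerical-stability trick, not part of the article's derivation): computed literally, Z can overflow as soon as some score β_k · X_i is large. Because adding the same constant to every score leaves the probabilities unchanged, implementations subtract the maximum score before exponentiating:

import numpy as np

def softmax(scores):
    # softmax(k, x_1..x_n) = exp(x_k) / sum_i exp(x_i), computed stably:
    # shifting all scores by a constant does not change the result.
    z = scores - np.max(scores)
    e = np.exp(z)
    return e / e.sum()

print(softmax(np.array([1000.0, 1001.0, 1002.0])))  # no overflow
print(softmax(np.array([0.0, 1.0, 2.0])))           # same output: only differences matter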
Note that not all of the vectors of coefficients are uniquely identifiable. This is because all of the probabilities must sum to 1, so one of them is completely determined once the rest are known. As a result, there are only K−1 separately specifiable probabilities, and hence K−1 separately identifiable vectors of coefficients. One way to see this is to note that if we add a constant vector C to all of the coefficient vectors, the predicted probabilities are unchanged, because the common factor e^{C·X_i} cancels between numerator and denominator (the derivation displayed earlier in this article works through this cancellation).

As a result, it is conventional to set C = −β_K (or alternatively, the negative of one of the other coefficient vectors), so that the chosen vector becomes 0 and the other vectors are transformed into their differences from it. This is equivalent to "pivoting" around one of the K choices, and examining how much better or worse the other K−1 choices are relative to it:

{\displaystyle {\boldsymbol {\beta }}'_{k}={\boldsymbol {\beta }}_{k}-{\boldsymbol {\beta }}_{K},\quad k<K,\qquad {\boldsymbol {\beta }}'_{K}=0.}

This leads to the following equations:

{\displaystyle \Pr(Y_{i}=k)={\frac {e^{{\boldsymbol {\beta }}'_{k}\cdot \mathbf {X} _{i}}}{1+\sum _{j=1}^{K-1}e^{{\boldsymbol {\beta }}'_{j}\cdot \mathbf {X} _{i}}}},\qquad k\leq K.}

Other than the prime symbols on the regression coefficients, this is exactly the form of the model described above in terms of K−1 independent two-way regressions.
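The invariance argument is easy to check numerically (a toy check; the numbers and names are invented for the example): adding the same vector C to every β_k leaves all the probabilities unchanged.

import numpy as np

rng = np.random.default_rng(3)
M, K = 3, 4
x = np.append(1.0, rng.normal(size=M))
beta = rng.normal(size=(K, M + 1))
C = rng.normal(size=M + 1)

def probs(b):
    s = b @ x                    # K scores; adding C shifts all of them by C . x
    e = np.exp(s - s.max())      # stable softmax
    return e / e.sum()

print(np.allclose(probs(beta), probs(beta + C)))  # True: e^{C.x} cancels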
As a latent-variable model

It is also possible to formulate multinomial logistic regression as a latent variable model, following the two-way latent variable model described for binary logistic regression. This formulation is common in the theory of discrete choice models, and makes it easier to compare multinomial logistic regression to the related multinomial probit model, as well as to extend it to more complex models.

Imagine that, for each data point i and each possible outcome k=1,2,...,K, there is a continuous latent variable Y*_{i,k} (i.e. an unobserved random variable) that is distributed as follows:

{\displaystyle Y_{i,k}^{\ast }={\boldsymbol {\beta }}_{k}\cdot \mathbf {X} _{i}+\varepsilon _{k},\qquad k\leq K,}

where the error variables ε_k ~ EV_1(0,1), i.e. each follows a standard type-1 extreme value (Gumbel) distribution.

This latent variable can be thought of as the utility associated with data point i choosing outcome k, where there is some randomness in the actual amount of utility obtained, accounting for other unmodeled factors that go into the choice. The value of the observed variable Y_i is then determined in a non-random fashion from these latent variables: outcome k is chosen if and only if the associated utility is the maximum of all the utilities, i.e.

{\displaystyle \Pr(Y_{i}=k)=\Pr {\bigl (}\max(Y_{i,1}^{\ast },Y_{i,2}^{\ast },\ldots ,Y_{i,K}^{\ast })=Y_{i,k}^{\ast }{\bigr )},\qquad k\leq K.}

Since the latent variables are continuous, the probability of an exact tie is zero and can be ignored. Writing out what these maxima mean produces the system of inequalities displayed earlier in this article; for example, the event Y_i = 1 reduces, after substituting the definitions of the latent variables, to conditions on the pairwise differences ε_k − ε_1 of the error terms. Two facts about the type-1 extreme value distribution then make these probabilities tractable:

If X ~ EV_1(a,b) and Y ~ EV_1(a,b) are independent, then X − Y ~ Logistic(0,b).
If X ~ Logistic(0,1), then bX ~ Logistic(0,b).

In particular, each difference ε_k − ε_1 follows a standard logistic distribution, exactly the error structure underlying binary logistic regression. Working through the resulting probabilities recovers the same softmax form derived above, showing that the random-utility (latent-variable) formulation and the log-linear formulation describe the same model; the multinomial probit model arises instead when the errors are assumed to be normally distributed.
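The equivalence can also be checked by simulation (a sketch with invented parameters, not from the article): drawing standard Gumbel, i.e. EV_1(0,1), noise and choosing the outcome with the highest utility reproduces the softmax probabilities.

import numpy as np

rng = np.random.default_rng(4)
M, K, draws = 3, 4, 200_000
x = np.append(1.0, rng.normal(size=M))
beta = rng.normal(size=(K, M + 1))
scores = beta @ x

# Utilities Y*_k = beta_k . x + eps_k with eps_k ~ EV_1(0,1) (standard Gumbel noise).
eps = rng.gumbel(size=(draws, K))
choices = np.argmax(scores + eps, axis=1)
empirical = np.bincount(choices, minlength=K) / draws

sm = np.exp(scores - scores.max())
sm /= sm.sum()
print(np.round(empirical, 3), np.round(sm, 3))  # the two should agree closely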
Likelihood function

For n independent observations with observed outcomes y_i ∈ {1, ..., K} of the explained variables Y_1, ..., Y_n, the likelihood function is the product over data points of the probability of each observed outcome:

{\displaystyle L=\prod _{i=1}^{n}P(Y_{i}=y_{i})=\prod _{i=1}^{n}\prod _{j=1}^{K}P(Y_{i}=j)^{\delta _{j,y_{i}}},}

where the indicator δ_{j,y_i} selects the observed class:

{\displaystyle \delta _{j,y_{i}}={\begin{cases}1&{\text{if }}j=y_{i},\\0&{\text{otherwise}}.\end{cases}}}

The negative log-likelihood is therefore

{\displaystyle -\log L=-\sum _{i=1}^{n}\sum _{j=1}^{K}\delta _{j,y_{i}}\log P(Y_{i}=j),}

which is the cross-entropy between the observed outcomes and the model's predicted distribution.
Application in natural language processing

In natural language processing, multinomial LR classifiers are commonly used as an alternative to naive Bayes classifiers because they do not assume statistical independence of the random variables (commonly known as features) that serve as predictors. However, learning in such a model is slower than for a naive Bayes classifier, and may therefore be inappropriate when the number of classes to learn is very large. In particular, learning in a naive Bayes classifier is a simple matter of counting co-occurrences of features and classes, while in a maximum entropy classifier the weights, typically chosen by maximum a posteriori (MAP) estimation, must be learned using an iterative procedure.

See also

Logistic regression
Multinomial probit
References

Baltas, G.; Doyle, P. (2001). "Random Utility Models in Marketing Research: A Survey". Journal of Business Research. 51 (2): 115–125.
Belsley, David (1991). Conditioning Diagnostics: Collinearity and Weak Data in Regression. New York: Wiley. ISBN 9780471528890.
Darroch, J. N.; Ratcliff, D. (1972). "Generalized iterative scaling for log-linear models". The Annals of Mathematical Statistics. 43 (5): 1470–1480.
Engel, J. (1988). "Polytomous logistic regression". Statistica Neerlandica. 42 (4): 233–252.
Greene, William H. (2012). Econometric Analysis (Seventh ed.). Boston: Pearson Education.
Malouf, Robert (2002). "A comparison of algorithms for maximum entropy parameter estimation" (PDF). Proceedings of the Sixth Conference on Natural Language Learning (CoNLL-2002). pp. 49–55.
Menard, Scott (2002). Applied Logistic Regression Analysis. SAGE. p. 91. ISBN 9780761922087.