An augmented transition network or ATN is a type of graph-theoretic structure used in the operational definition of formal languages, used especially in parsing relatively complex natural languages, and having wide application in artificial intelligence. An ATN can, theoretically, analyze the structure of any sentence, however complicated. ATNs are modified transition networks, an extension of recursive transition networks (RTNs).

ATNs build on the idea of using finite state machines (Markov models) to parse sentences. W. A. Woods, in "Transition Network Grammars for Natural Language Analysis", claims that by adding a recursive mechanism to a finite state model, parsing can be achieved much more efficiently. Instead of building an automaton for a particular sentence, a collection of transition graphs is built. A grammatically correct sentence is parsed by reaching a final state in any state graph. Transitions between these graphs are simply subroutine calls from one state to any initial state on any graph in the network. A sentence is determined to be grammatically correct if a final state is reached by the last word in the sentence.
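
The subroutine-call mechanism is easy to make concrete. Below is a minimal sketch of an RTN recognizer for a toy grammar; the network layout, the lexicon, and all names (NETWORKS, FINAL, walk, grammatical) are illustrative assumptions, not anything defined in Woods's paper.

```python
# A minimal RTN recognizer, assuming a toy grammar and lexicon.  The network
# layout and all names here (NETWORKS, FINAL, LEXICON, walk, grammatical)
# are illustrative, not taken from Woods's paper.

# Each network maps a state to its outgoing arcs.  An arc either consumes
# one word of a given category ("word") or calls another network ("push"),
# which must reach one of its final states before control returns.
NETWORKS = {
    "S":  {"q0": [("push", "NP", "q1")],
           "q1": [("push", "VP", "q2")],
           "q2": []},
    "NP": {"q0": [("word", "DET", "q1"), ("word", "NOUN", "q2")],
           "q1": [("word", "NOUN", "q2")],
           "q2": []},
    "VP": {"q0": [("word", "VERB", "q1")],
           "q1": [("push", "NP", "q2")],
           "q2": []},
}
FINAL = {"S": {"q2"}, "NP": {"q2"}, "VP": {"q1", "q2"}}

LEXICON = {"the": "DET", "a": "DET", "dog": "NOUN", "ball": "NOUN",
           "saw": "VERB", "chased": "VERB"}

def walk(net, state, words, i):
    """Yield every input position at which `net` can stop in a final state."""
    if state in FINAL[net]:
        yield i                                  # control may return here
    for kind, label, nxt in NETWORKS[net][state]:
        if kind == "word" and i < len(words) and LEXICON.get(words[i]) == label:
            yield from walk(net, nxt, words, i + 1)
        elif kind == "push":                     # subroutine call into `label`
            for j in walk(label, "q0", words, i):
                yield from walk(net, nxt, words, j)

def grammatical(sentence):
    """Accept iff some path through S is in a final state at the last word."""
    words = sentence.split()
    return any(j == len(words) for j in walk("S", "q0", words, 0))

print(grammatical("the dog chased a ball"))  # True
print(grammatical("dog the chased"))         # False
```

Because walk is a generator, every alternative path is explored lazily, so the recognizer backtracks over choices for free; an ATN layers registers and arc tests on top of this same control structure.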

This model meets many of the goals set forth by the nature of language in that it captures the regularities of the language. That is, if there is a process that operates in a number of environments, the grammar should encapsulate the process in a single structure. Such encapsulation not only simplifies the grammar, but also has the added bonus of operational efficiency. Another advantage of such a model is the ability to postpone decisions. Many grammars use guessing when an ambiguity comes up; this means that not enough is yet known about the sentence. By the use of recursion, ATNs solve this inefficiency by postponing decisions until more is known about the sentence.
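
The "augmented" part can be sketched the same way. In Woods's formulation, an arc may carry a test that must succeed before the arc is taken and actions that set registers, so information discovered early (for example, the number of the subject) can be recorded and consulted later rather than guessed at. The two-arc network, the lexicon, and the register name SUBJ_NUM below are invented for illustration.

```python
# A sketch of the "augmentation": arcs carry tests and register-setting
# actions.  The two-arc network, lexicon, and register name SUBJ_NUM are
# invented for illustration, not taken from Woods's paper.

LEXICON = {
    "dog":  {"cat": "NOUN", "num": "sg"},
    "dogs": {"cat": "NOUN", "num": "pl"},
    "runs": {"cat": "VERB", "num": "sg"},
    "run":  {"cat": "VERB", "num": "pl"},
}

def parse(sentence):
    words = sentence.split()
    if len(words) != 2:
        return False
    registers = {}

    # Arc 1: consume a noun; action: record its number in a register.
    noun = LEXICON.get(words[0], {})
    if noun.get("cat") != "NOUN":
        return False
    registers["SUBJ_NUM"] = noun["num"]

    # Arc 2: consume a verb; test: it must agree with the stored number.
    verb = LEXICON.get(words[1], {})
    if verb.get("cat") != "VERB" or verb["num"] != registers["SUBJ_NUM"]:
        return False
    return True

print(parse("dogs run"))  # True  -- plural subject, plural verb
print(parse("dog run"))   # False -- the agreement test rejects the arc
```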

See also

Context-free language
Finite state machine
Formal grammar
Parsing
Recursive transition network

References

Wanner, Eric (1980). "The ATN and the Sausage Machine: which one is baloney?". Cognition. 8 (2): 209–225. doi:10.1016/0010-0277(80)90013-X. PMID 7389289.
Wanner, Eric; Maratsos, Michael (1978). "An ATN approach to comprehension". In M. Halle; J. Bresnan; G.A. Miller (eds.). Linguistic Theory and Psychological Reality. Cambridge: MIT Press.
Winograd, Terry (1983). Language as a Cognitive Process, Volume 1: Syntax. Addison–Wesley, Reading, MA.
Woods, William A. (1970). "Transition Network Grammars for Natural Language Analysis" (PDF). Communications of the ACM. 13 (10): 591–606. doi:10.1145/355598.362773.

External links

An introduction on ATNs by Paul Graham in On Lisp