For social networks, Liben-Nowell and Kleinberg proposed link prediction models based on different graph proximity measures. Several statistical models have been proposed for link prediction by the machine learning and data mining communities. For example, Popescul et al. proposed a structured logistic regression model that can make use of relational features. Local conditional probability models based on attribute and structural features were proposed by O’Madadhain et al. Getoor proposed several models based on directed graphical models for collective link prediction. Other approaches based on random walks and matrix factorization have also been proposed. With the advent of deep learning, several graph embedding based approaches for link prediction have also been proposed. For more information on link prediction, refer to the surveys by Getoor et al. and Yu et al.
Link prediction approaches can be divided into two broad categories based on the type of the underlying network: (1) link prediction approaches for homogeneous networks and (2) link prediction approaches for heterogeneous networks. Based on the type of information used to predict links, approaches can be categorized as topology-based, content-based, and mixed methods.
(PSL) is a probabilistic graphical model over a hinge-loss Markov random field (HL-MRF). HL-MRFs are created by a set of templated first-order-logic-like rules, which are then grounded over the data. PSL can combine attribute, or local, information with topological, or relational, information. While PSL can incorporate local predictors, such as
(RMLs) is a neural network model created to provide a deep learning approach to the link weight prediction problem. The model uses a node embedding technique that extracts node embeddings (knowledge of nodes) from the known links’ weights (relations between nodes) and uses this knowledge to predict the weights of the unknown links.
In biology, link prediction has been used to predict interactions between proteins in protein-protein interaction networks. It has also been used to infer interactions between drugs and their targets.
Another application is found in collaboration prediction in scientific
Single link approaches learn a model that classifies each link independently. Structured prediction approaches capture the correlation between potential links by formulating the task as a collective link prediction task. Collective link prediction approaches learn a model that jointly identifies all
A probabilistic relational model (PRM) specifies a template for a probability distribution over a database. The template describes the relational schema for the domain and the probabilistic dependencies between attributes in the domain. A PRM, together with a particular database of entities and
The link prediction task can also be formulated as an instance of missing value estimation. Here, the graph is represented as an adjacency matrix with missing values, and the task is to complete the matrix by identifying the missing values. Matrix factorization based methods commonly use this
(MLNs) are probabilistic graphical models defined over Markov networks. These networks are defined by templated first-order-logic-like rules, which are then grounded over the training data. MLNs are able to incorporate both local and relational rules for the purpose of link prediction.
similarity, or Euclidean distance, hold in the embedding space. These similarities are functions of both topological features and attribute-based similarity. One can then use other machine learning techniques to predict edges on the basis of vector similarity.
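As an illustration of predicting edges from vector similarity, the sketch below ranks candidate links by the dot product of their endpoint embeddings. The node names and two-dimensional vectors are hypothetical stand-ins for the output of an embedding algorithm, not from any particular dataset:

```python
import numpy as np

# Hypothetical node embeddings (illustrative only); in practice these
# would come from an embedding algorithm such as node2vec.
embedding = {
    "A": np.array([0.9, 0.1]),
    "B": np.array([0.8, 0.2]),
    "C": np.array([0.1, 0.9]),
}

# Score each candidate link by the dot product of its endpoint vectors.
candidates = [("A", "B"), ("A", "C")]
scores = {pair: float(embedding[pair[0]] @ embedding[pair[1]])
          for pair in candidates}

# The candidate with the most similar endpoints is ranked first.
best = max(scores, key=scores.get)
```

Any vector similarity (cosine, negative Euclidean distance) can be substituted for the dot product without changing the ranking logic.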
, also known as deduplication, commonly uses link prediction to predict whether two entities in a network are references to the same physical entity. Some authors have used context information in network-structured domains to improve entity resolution.
. Link prediction is widely applicable. In e-commerce, link prediction is often a subtask of recommending items to users. In the curation of citation databases, it can be used for record deduplication. In bioinformatics, it has been used to predict
Link prediction has found varied uses, but any domain in which entities interact in a structured way can benefit from link prediction. A common application of link prediction is improving similarity measures for
Neighbor-based methods can be effective when the number of neighbors is large, but this is not the case in sparse graphs. In these situations it is appropriate to use methods that account for longer walks.
In the binary classification formulation of the link prediction task, potential links are classified as either true links or false links. Link prediction approaches for this setting learn a classifier
is the sum, over the common neighbors of two nodes, of the inverse logarithm of each common neighbor's degree. This captures a two-hop similarity, which can yield better results than simple one-hop methods. It is computed as follows:
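A minimal Python sketch of this measure over a toy adjacency-dict graph (the graph is illustrative; the sketch assumes every common neighbor has degree at least 2, so its logarithm is positive):

```python
import math

# Toy undirected graph as an adjacency dict (illustrative only).
graph = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"A", "C"},
}

def adamic_adar(g, x, y):
    """Sum of 1 / log(degree(u)) over the common neighbors u of x and y."""
    return sum(1.0 / math.log(len(g[u])) for u in g[x] & g[y])

# B and D share neighbors A and C, each of degree 3.
score = adamic_adar(graph, "B", "D")
```

Low-degree common neighbors contribute more to the score than high-degree hubs, which is the intuition behind the inverse-log weighting.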
approaches to recommendation. Link prediction is also frequently used in social networks to suggest friends to users. It has also been used to predict criminal associations.
. The goal of link prediction is to identify the unobserved true links. In the temporal formulation of link prediction, the observed links correspond to true links at a time
Backstrom, Lars; Leskovec, Jure (2011). "Supervised random walks: predicting and recommending links in social networks". In King, Irwin; Nejdl, Wolfgang; Li, Hang (eds.).
is the problem of predicting the existence of a link between two entities in a network. Examples of link prediction include predicting friendship links among users in a
After normalizing the attribute values, computing the cosine between the two vectors is a good measure of similarity, with higher values indicating higher similarity.
The attribute values are represented as normalized vectors, and the distance between the vectors is used to measure similarity. Small distances indicate high similarity.
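Both attribute-based measures can be sketched as plain functions over attribute vectors (the vectors below are hypothetical illustrations, not from the article):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two attribute vectors; 1.0 = identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def euclidean_distance(u, v):
    """Straight-line distance between two attribute vectors; 0.0 = identical."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Two nodes with identical (already normalized) attribute vectors.
u, v = [0.6, 0.8, 0.0], [0.6, 0.8, 0.0]
```

For identical vectors the cosine is 1.0 (maximal similarity) while the Euclidean distance is 0.0, matching the two conventions described above.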
. In the probability estimation formulation, potential links are associated with existence probabilities. Link prediction approaches for this setting learn a model
Several link prediction approaches have been proposed, including unsupervised approaches such as similarity measures computed on the entity attributes,
Bach, Stephen; Broecheler, Matthias; Huang, Bert; Getoor, Lise (2017). "Hinge-Loss Markov Random Fields and
Probabilistic Soft Logic".
Proceedings of the Fourth
International Conference on Web Search and Web Data Mining, WSDM 2011, Hong Kong, China, February 9-12, 2011
, learn an embedding space in which neighboring nodes are represented by vectors so that vector similarity measures, such as
Topology-based methods broadly make the assumption that nodes with similar network structure are more likely to form a link.
"Evaluation of different biological data and computational classification methods for use in protein interaction prediction"
Xiao, Han; et al. (2015). "From One Point to A Manifold: Knowledge Graph
Embedding For Precise Link Prediction".
(PPI). It is also used to identify hidden groups of terrorists and criminals in security-related applications.
indicate the presence (or absence) of links between two nodes through intermediaries. For instance, in matrix
A weakness of this approach is that it does not take into account the relative number of common neighbors.
. Link prediction can also have a temporal aspect, where, given a snapshot of the set of links at time
. Entities with more neighbors in common are more likely to have a link. It is computed as follows:
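A minimal Python sketch of the common-neighbors count on a toy adjacency-dict graph (the graph is illustrative):

```python
# Toy undirected graph as an adjacency dict (illustrative only).
graph = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"A", "C"},
}

def common_neighbors(g, x, y):
    """Number of neighbors shared by nodes x and y: |N(x) ∩ N(y)|."""
    return len(g[x] & g[y])

# B and D share the neighbors A and C.
print(common_neighbors(graph, "B", "D"))  # -> 2
```

The score is a raw count, which is exactly the weakness noted above: it ignores how many neighbors each node has overall.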
represents the set of "true" links across entities in the network. We are given the set of entities
addresses the problem of Common Neighbors by computing the relative number of neighbors in common:
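A minimal Python sketch of the Jaccard measure, normalizing the common-neighbor count by the size of the neighbor union (toy graph, illustrative only):

```python
# Toy undirected graph as an adjacency dict (illustrative only).
graph = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"A", "C"},
}

def jaccard(g, x, y):
    """|N(x) ∩ N(y)| / |N(x) ∪ N(y)|, i.e. the normalized overlap."""
    union = g[x] | g[y]
    return len(g[x] & g[y]) / len(union) if union else 0.0

# B and D have identical neighbor sets {A, C}, so the score is maximal.
print(jaccard(graph, "B", "D"))  # -> 1.0
```

Unlike the raw common-neighbors count, the score is bounded in [0, 1], so node pairs with few but entirely shared neighbors are not dominated by high-degree pairs.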
The task of link prediction has attracted attention from several research communities ranging from
1322:{\displaystyle C_{\mathrm {Katz} }(i)=\sum _{k=1}^{\infty }\sum _{j=1}^{n}\alpha ^{k}(A^{k})_{ji}}
Bhattacharya, Indrajit; Getoor, Lise (2007). "Collective entity resolution in relational data".
Liben-Nowell, David; Kleinberg, Jon (2007). "The Link-Prediction
Problem for Social Networks".
is one metric that captures this. It is computed by searching the graph for paths of length
in the graph and adding the counts of each path length weighted by user-specified weights.
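Because the weighted path sum has the closed form (I − αA)⁻¹ − I whenever α is smaller than the reciprocal of the largest eigenvalue of A, the Katz scores can be computed without enumerating paths. A NumPy sketch on a hypothetical 4-node graph (the adjacency matrix is made up for illustration):

```python
import numpy as np

# Adjacency matrix of a toy 4-node undirected graph (illustrative only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

alpha = 0.1  # attenuation weight; must be < 1 / lambda_max(A) to converge

# Closed form of sum_{k>=1} alpha^k A^k.
S = np.linalg.inv(np.eye(4) - alpha * A) - np.eye(4)

# Katz centrality of node i: sum over j and k of alpha^k (A^k)_ji.
katz_centrality = S.sum(axis=0)
```

Longer walks are damped geometrically by α, so direct links dominate but multi-hop connectivity still contributes; here node 2, which touches every other node, gets the highest score.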
"A Probabilistic Approach for Collective Similarity-based Drug-Drug Interaction Prediction"
methods predict the existence of a link based on the similarity of the node attributes.
, it indicates that node 2 and node 12 are connected through some walk of length 3. If
"On Graph Mining With Deep Learning: Introducing Model R for Link Weight Prediction"
also offer a convenient way to predict links. Graph embedding algorithms, such as
unobserved links, defines a probability distribution over the unobserved links.
, it also supports relational rules, such as triangle completion in a network.
Martinez, Victor (2016). "A Survey of Link
Prediction in Complex Networks".
Note that the above definition uses the fact that the element at location
916:{\displaystyle A(x,y)=\sum _{u\in N(x)\cap N(y)}{\frac {1}{\log |N(u)|}},}
Richardson, Matthew; Domingos, Pedro M. (2006). "Markov logic networks".
Katz, L. (1953). "A New Status Index
Derived from Sociometric Analysis".
This is a common approach to link prediction that computes the number of
1888:. Lecture Notes in Computer Science. Vol. 6912. pp. 437–452.
Adamic, Lada; Adar, Eytan (2003). "Friends and neighbors on the web".
Getoor, Lise; Friedman, Nir; Koller, Daphne; Taskar, Benjamin (2002).
Journal of the
American Society for Information Science and Technology
Journal of the
American Society for Information Science and Technology
Getoor, Lise; Diehl, Christopher P. (2005). "Link mining: a survey".
, and we need to identify true links among these potential links.
"Prediction and Ranking Algorithms for Event-Based Network Data"
Usually, we are also given a subset of unobserved links called
, and predicting interactions between genes and proteins in a
O’Madadhain, Joshua; Hutchins, Jon; Smyth, Padhraic (2005).
Workshop on Learning Statistical Models from Relational Data
Mixed methods combine attribute-based and topology-based methods.
Sridhar, Dhanya; Fakhraei, Shobeir; Getoor, Lise (2016).
propose an approach to generate links between nodes in a
. In statistics, generative random graph models such as
, and the goal is to infer the set of true links at time
Yu, Philip S.; Han, Jiawei; Faloutsos, Christos (2010).
783:{\displaystyle J(A,B)={{|A\cap B|} \over {|A\cup B|}}}
Machine Learning and Knowledge Discovery in Databases
1776:"Statistical Relational Learning for Link Prediction"
and a subset of true links which are referred to as
ACM Transactions on Knowledge Discovery from Data
Proteins: Structure, Function, and Bioinformatics
Link Mining: Models, Algorithms, and Applications
"Learning Probabilistic Models of Link Structure"
the true links among the set of potential links.
"A Survey of Link Prediction in Social Networks"
represents the entity nodes in the network and
Hasan, Mohammad Al; Zaki, Mohammed J. (2011).
List of datasets for machine learning research
are variables that take a value 1 if a node
of a network under consideration. Elements
, the goal is to predict the links at time
"Link prediction via matrix factorization"
1774:Popescul, Alexandrin; Ungar, Lyle (2002).
2154:Hou, Yuchen; Holder, Lawrence B. (2019).
denotes Katz centrality of a node
Menon, Aditya; Elkan, Charles (2011).
{\displaystyle C_{\mathrm {Katz} }(i)}
, predicting co-authorship links in a
to positive and negative labels i.e.
Journal of Machine Learning Research
{\displaystyle CN(A,B)={|A\cap B|}}
{\displaystyle M_{b}:E'\to \{0,1\}}
J. Artif. Intell. Soft Comput. Res.
Probabilistic relationship models
degree connections between nodes
. In Aggarwal, Charu C. (ed.).
Statistical relational learning
, for other kinds of embeddings
and 0 otherwise. The powers of
. Springer. pp. 243–275.
Probabilistic soft logic (PSL)
{\displaystyle E\subseteq |V|}
10.1093/bioinformatics/btw342
10.1016/S0378-8733(03)00009-1
. Springer. pp. 665–670.
Social Network Data Analytics
reflects the total number of
10.1007/978-3-642-23783-6_28
Graph (discrete mathematics)
Similarity (network science)
Markov logic networks (MLNs)
Node attribute-based methods
{\displaystyle (a_{2,12})=1}
{\displaystyle M_{p}:E'\to [0,1]}
protein-protein interactions
10.1007/978-1-4419-8462-3_9
Fairness (machine learning)
Regular map (graph theory)
Doubly connected edge list
Explanation-based learning
10.1007/S10994-006-5833-1
10.1007/978-1-4419-6515-8
. ACM. pp. 635–644.
Problem in network theory
10.2478/JAISCR-2018-0022
Aggarwal, Charu (2015).
Probabilistic soft logic
co-authorship networks.
Probabilistic soft logic
{\displaystyle (a_{ij})}
10.1145/1217299.1217304
10.1145/1117454.1117456
10.1145/1935826.1935914
collaborative filtering
, then mathematically:
stochastic block models
{\displaystyle G=(V,E)}
Stochastic block model
1445:
1425:
1405:
1385:
1358:
1323:
1282:
1261:
1188:
1141:
1096:
1053:
1006:
973:
949:
917:
784:
682:
Topology-based methods
Approaches and methods
to a probability i.e.
ACM Computing Surveys
Markov logic networks
{\displaystyle A^{k}}
{\displaystyle (i,j)}
{\displaystyle A^{3}}
is connected to node
of nodes adjacent to
{\displaystyle M_{p}}
{\displaystyle M_{b}}
Predictive analytics
{\displaystyle N(u)}
matrix factorization
Qi, Yanjun (2006).
J. Mach. Learn. Res.
Adamic–Adar measure
Adamic–Adar measure
that maps links in
that maps links in
{\displaystyle t+1}
{\displaystyle |V|}
Consider a network
{\displaystyle t+1}
10.1002/prot.20865
10.1007/BF02289026
Euclidean distance
Problem definition
biological network
(20): 3175–3182.
978-1-4419-6514-1
978-3-642-23782-9
10.1002/asi.20591
978-1-4419-8461-6
Entity resolution
cosine similarity
Cosine similarity
{\displaystyle j}
{\displaystyle i}
{\displaystyle k}
{\displaystyle t}
{\displaystyle u}
{\displaystyle t}
{\displaystyle V}
{\displaystyle V}
{\displaystyle t}
Graph algorithms
(1–2): 107–136.
(7): 1019–1031.
Graph embeddings
Graph embeddings
adjacency matrix
The Katz Measure
common neighbors
Common neighbors
graphical models
machine learning
citation network
10.1145/3012704
Social Networks
Graph thickness
Graph embedding
Node-similarity
Jaccard Measure
Jaccard measure
network science
potential links
link prediction
Network theory
Bioinformatics
(3): 490–500.
(3): 211–230.
Fáry's theorem
Book thickness
R-Model (RMLs)
observed links
social network
network theory
Link analysis
Psychometrika
SIGKDD Explor.
10.1.1.58.689
Mixed methods
, if element
deep learning
formulation.
(1): 21–40.
. Springer.
Applications
Katz measure
random graph
(4): 1–33.
Mach. Learn.
Data Mining
(2): 3–12.
dot product
random walk
data mining
Categories
1505.04406
1512.04792
: 679–707.
References
statistics
1903/4241
121768822
: 39–43.
1011.4071
CiteSeerX
Embedding
27354693
16450363
14193467
: 1–67.
13892350
Node2vec
Big data
See also
R-Models
Node2vec
, where
3250929
2262951
Seq2seq
be the
is the
History
2348:488972
2346:
2301:
2257:
2247:
2208:
2083:
2048:
1996:
SIGMOD
1910:
1900:
1749:
1720:
where
and
Let
The
The
and
and
and
and
of
of
set
log
to
In