Proceedings Article, Paper
@InProceedings
Contribution in conference proceedings, workshop



Author, Editor

Author(s):

Suchanek, Fabian
Kasneci, Gjergji
Weikum, Gerhard




Editor(s):

Williamson, Carey L.
Zurko, Mary Ellen
Patel-Schneider, Peter F.
Shenoy, Prashant J.


Not MPII Editor(s):

Williamson, Carey L.
Zurko, Mary Ellen
Patel-Schneider, Peter F.
Shenoy, Prashant J.

BibTeX cite key*:

SuchanekKW2007

Title, Booktitle

Title*:

YAGO: A Core of Semantic Knowledge - Unifying WordNet and Wikipedia


yago_www2007.pdf (185.59 KB)

Booktitle*:

16th International World Wide Web Conference (WWW 2007)

Event, URLs

URL of the conference:

http://www2007.org/

URL for downloading the paper:

http://doi.acm.org/10.1145/1242572.1242667

Event Address*:

Banff, Canada

Language:

English

Event Date*
(no longer used):


Organization:


Event Start Date:

8 May 2007

Event End Date:

12 May 2007

Publisher

Name*:

ACM

URL:


Address*:

New York, USA

Type:


Vol, No, Year, pp.

Series:


Volume:


Number:


Month:


Pages:

697-706

Year*:

2007

VG Wort Pages:

45

ISBN/ISSN:

978-1-59593-654-7

Sequence Number:


DOI:

10.1145/1242572.1242667



Note, Abstract, ©


(LaTeX) Abstract:

We present {YAGO}, a light-weight and extensible ontology with high coverage and quality. {YAGO} builds on entities and relations and currently contains roughly 900,000 entities and 5,000,000 facts. This includes the Is-A hierarchy as well as non-taxonomic relations between entities (such as hasWonPrize). The facts have been automatically extracted from the unification of Wikipedia and WordNet, using a carefully designed combination of rule-based and heuristic methods described in this paper. The resulting knowledge base is a major step beyond WordNet: in quality by adding knowledge about individuals like persons, organizations, products, etc. with their semantic relationships -- and in quantity by increasing the number of facts by more than an order of magnitude. Our empirical evaluation of fact correctness shows an accuracy of about 95%. {YAGO} is based on a logically clean model, which is decidable, extensible, and compatible with {RDFS}. Finally, we show how {YAGO} can be further extended by state-of-the-art information extraction techniques.

Keywords:

Wikipedia, WordNet



Download
Access Level:

Public

Correlation

MPG Unit:

Max-Planck-Institut für Informatik



MPG Subunit:

Databases and Information Systems Group

Audience:

popular

Appearance:

MPII WWW Server, MPII FTP Server, MPG publications list, university publications list, working group publication list, Fachbeirat, VG Wort



BibTeX Entry:

@INPROCEEDINGS{SuchanekKW2007,
AUTHOR = {Suchanek, Fabian and Kasneci, Gjergji and Weikum, Gerhard},
EDITOR = {Williamson, Carey L. and Zurko, Mary Ellen and Patel-Schneider, Peter F. and Shenoy, Prashant J.},
TITLE = {{YAGO}: A Core of Semantic Knowledge - Unifying {WordNet} and {Wikipedia}},
BOOKTITLE = {16th International World Wide Web Conference (WWW 2007)},
PUBLISHER = {ACM},
YEAR = {2007},
PAGES = {697--706},
ADDRESS = {Banff, Canada},
ISBN = {978-1-59593-654-7},
DOI = {10.1145/1242572.1242667},
}
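
For reference, a minimal LaTeX sketch of how this entry can be cited via its BibTeX key SuchanekKW2007; the file name refs.bib used below is a hypothetical placeholder for wherever the entry above is stored:

\documentclass{article}
\begin{document}
% Cite the YAGO paper through the key defined in the entry above.
YAGO unifies WordNet and Wikipedia~\cite{SuchanekKW2007}.
% "refs" stands for a hypothetical refs.bib containing the @INPROCEEDINGS entry.
\bibliographystyle{plain}
\bibliography{refs}
\end{document}

Running latex followed by bibtex (or a single latexmk pass) resolves the key into the full entry in the reference list.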


Entry last modified by Martin Theobald, 04/14/2009
Edit History

Created: 02/13/2007 03:14:46 PM by Ralf Schenkel

Revision  Editor(s)           Edit Date
8         Martin Theobald     04/14/2009 04:26:28 PM
7         Olha Condor         18.11.2008 13:06:48
6         Adriana Davidescu   04.01.2008 12:48:17
5         Adriana Davidescu   30.10.2007 16:50:06
4         Adriana Davidescu   30.10.2007 16:49:57
Attachment Section
yago_www2007.pdf