
DBpedia Live Extraction

Sebastian Hellmann, Claus Stadler, Jens Lehmann, and Sören Auer

Universität Leipzig, Institute of Computer Science, Johannisgasse 26, 04103 Leipzig, Germany
hellmann@informatik.uni-leipzig.de
mai02jgj@studserv.uni-leipzig.de
lehmann@informatik.uni-leipzig.de
auer@informatik.uni-leipzig.de
http://aksw.org

Abstract. The DBpedia project extracts information from Wikipedia, interlinks it with other knowledge bases, and makes this data available as RDF. So far the DBpedia project has succeeded in creating one of the largest knowledge bases on the Data Web, which is used in many applications and research prototypes. However, the heavy-weight extraction process has been a drawback: it requires manual effort to produce a new release, and the extracted information is not up-to-date. We extended DBpedia with a live extraction framework, which is capable of processing tens of thousands of changes per day in order to consume the constant stream of Wikipedia updates. This allows direct modifications of the knowledge base and closer interaction of users with DBpedia. We also show how the Wikipedia community itself is now able to take part in the DBpedia ontology engineering process, and that interactive round-trip engineering between Wikipedia and DBpedia is now possible.

LNCS 5871, p. 1209 ff.



© Springer-Verlag Berlin Heidelberg 2009