This repository contains modules for Linked Data Fragments (LDF) servers.
Find more information about migrating from ldf-server 2.x.x on our wiki.
On today's Web, Linked Data is published in different ways, which include data dumps, subject pages, and results of SPARQL queries. We call each such part a Linked Data Fragment.
The issue with current Linked Data Fragments is that they are either so powerful that their servers suffer from low availability (as is the case with SPARQL endpoints), or so limited that they do not allow efficient querying.
Instead, this server offers Quad Pattern Fragments (a.k.a. Triple Pattern Fragments). Each Quad Pattern Fragment offers:
- data that corresponds to a quad/triple pattern (example).
- metadata that consists of the (approximate) total triple count (example).
- controls that lead to all other fragments of the same dataset (example).
An example server is available at data.linkeddatafragments.org.
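Concretely, a client selects a fragment by passing the components of a triple pattern as URL query parameters, where an empty parameter acts as a wildcard. A minimal sketch, assuming a server that exposes a dataset at the hypothetical path http://localhost:5000/dbpedia:

```shell
# Build a fragment URL that selects all triples with predicate rdf:type.
# The subject/predicate/object query parameters are advertised by the
# fragment's hypermedia controls; the base URL below is a placeholder.
BASE="http://localhost:5000/dbpedia"
PREDICATE="http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type"
FRAGMENT_URL="${BASE}?subject=&predicate=${PREDICATE}&object="
echo "$FRAGMENT_URL"
```

Dereferencing such a URL yields the fragment's data, metadata, and controls described above.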
The easiest way to start using this server is via @ldf/server (previously known as ldf-server).
This server requires Node.js 10.0 or higher and is tested on OSX and Linux. To install, execute:
$ [sudo] npm install -g @ldf/server
First, create a configuration file config.json similar to config/config-example.json, in which you detail your data sources. For example, this configuration uses an HDT file and a SPARQL endpoint as sources:
{
  "@context": "https://linkedsoftwaredependencies.org/bundles/npm/@ldf/server/^3.0.0/components/context.jsonld",
  "@id": "urn:ldf-server:my",
  "import": "preset-qpf:config-defaults.json",
  "title": "My Linked Data Fragments server",
  "datasources": [
    {
      "@id": "ex:myHdtDatasource",
      "@type": "HdtDatasource",
      "datasourceTitle": "DBpedia 2014",
      "description": "DBpedia 2014 with an HDT back-end",
      "datasourcePath": "dbpedia",
      "hdtFile": "data/dbpedia2014.hdt"
    },
    {
      "@id": "ex:mySparqlDatasource",
      "@type": "SparqlDatasource",
      "datasourceTitle": "DBpedia (Virtuoso)",
      "description": "DBpedia with a Virtuoso back-end",
      "datasourcePath": "dbpedia-sparql",
      "sparqlEndpoint": "https://dbpedia.org/sparql"
    }
  ]
}
More details on how to configure this server can be found in the README of @ldf/server.
After creating a configuration file, execute
$ ldf-server config.json 5000 4
Here, 5000 is the HTTP port on which the server will listen, and 4 the number of worker processes. Now visit http://localhost:5000/ in your browser.
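Besides the HTML interface shown in the browser, fragments can be retrieved in RDF serializations via content negotiation. A sketch, assuming the server started above is running and exposes a datasource at the hypothetical path /dbpedia:

```shell
# Request a fragment as TriG; the response contains the matching triples
# together with count metadata and hypermedia controls.
# Falls back to a notice when no server is listening on port 5000.
RESPONSE=$(curl -s -H "Accept: application/trig" \
  "http://localhost:5000/dbpedia?predicate=http%3A%2F%2Fxmlns.com%2Ffoaf%2F0.1%2Fname" \
  || echo "no server running on port 5000")
echo "$RESPONSE"
```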
This repository should be used by LDF Server module developers as it contains multiple LDF Server modules that can be composed. We manage this repository as a monorepo using Lerna.
The following modules are available:
- @ldf/core: Shared functionality for LDF servers.
- @ldf/server: An LDF server with Quad/Triple Pattern Fragments support.
- @ldf/preset-qpf: Configuration presets for Quad/Triple Pattern Fragments servers.
- @ldf/feature-qpf: Feature that enables Quad Pattern Fragments (a.k.a. Triple Pattern Fragments).
- @ldf/feature-summary: Feature that adds summaries to datasources.
- @ldf/feature-memento: Feature that enables datetime negotiation using the Memento protocol.
- @ldf/feature-webid: Feature that enables authenticated requests from clients with a WebID.
- @ldf/datasource-hdt: Datasource that allows HDT files to be loaded.
- @ldf/datasource-jsonld: Datasource that allows JSON-LD files to be loaded.
- @ldf/datasource-rdfa: Datasource that allows RDFa files to be loaded.
- @ldf/datasource-n3: Datasource that allows N-Quads, N-Triples, TriG and Turtle files to be loaded.
- @ldf/datasource-sparql: Datasource that allows SPARQL endpoints to be used as a data proxy.
- @ldf/datasource-composite: Datasource that delegates queries to a sequence of other datasources.
These modules can be used to configure your own LDF server with the features you want.
As an example of how to make such a server, you can have a look at @ldf/server, which is a default server configuration with all possible features enabled.
If you want to develop new features or use the (potentially unstable) in-development version, you can set up a development environment for this server.
LDF Server requires Node.js 10.0 or higher and the Yarn package manager, and is tested on OSX, Linux and Windows.
This project can be set up by cloning and installing it as follows:
$ git clone https://github.com/LinkedDataFragments/Server.js.git
$ cd Server.js
$ yarn install
Note: npm install is not supported at the moment, as this project makes use of Yarn's workspaces functionality.
This will install the dependencies of all modules and bootstrap the Lerna monorepo. After that, all LDF Server packages are available in the packages/ folder and can be used in a development environment.
Furthermore, this will add pre-commit hooks to build, lint and test. These hooks can temporarily be disabled at your own risk by adding the -n flag to the commit command.
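For example, to skip the hooks for a single work-in-progress commit (demonstrated here in a throwaway repository so it does not touch your checkout):

```shell
# -n (--no-verify) bypasses the pre-commit and commit-msg hooks once.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo "draft" > notes.txt
git add notes.txt
git -c user.name=dev -c user.email=dev@example.com \
  commit -q -n -m "WIP: skip build/lint/test hooks"
COMMITS=$(git rev-list --count HEAD)
echo "$COMMITS commit(s) created"
```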
The Linked Data Fragments server is written by Ruben Verborgh, Miel Vander Sande, Ruben Taelman and colleagues.
This code is copyrighted by Ghent University – imec and released under the MIT license.