Selected Speakers

Vienna University of Technology, Austria, Senior Researcher

Allan Hanbury is Senior Researcher at the Vienna University of Technology, Austria. He is scientific coordinator of the EU-funded KHRESMOI Integrated Project on medical and health information search and analysis, coordinator of the EU-funded VISCERAL project on evaluation of algorithms on big data, and coordinator of the CHIST-ERA project MUCKE on search in multimodal data and social networks. His research interests include information retrieval, multimodal information retrieval, and the evaluation of information retrieval systems and algorithms. He is author or co-author of over 80 publications in refereed journals and international conferences.
Further Information
Homepage: http://allan.hanbury.eu
VISCERAL project: http://visceral.eu
Khresmoi project: http://khresmoi.eu

Algorithm any good? A Cloud-based Infrastructure for Evaluation on Big Data

The cloud-based infrastructure being built in the VISCERAL project for evaluating machine learning and information retrieval algorithms on terabytes of data is presented. Instead of downloading data and running evaluations locally, the data will be centrally available on the cloud and the algorithms to be evaluated will be programmed on the cloud, effectively bringing the algorithms to the data. The design of the VISCERAL infrastructure is presented, concentrating on the components for coordinating the benchmark participants and managing the creation of the ground truth. Future directions of development and the potential impact on research and industry are discussed.


Sponge Media Lab, Business Strategists

Andreea Bonea, PMP, is a well-versed Project Management Professional, trained in Silicon Valley, whose expertise and qualifications were retained and further developed by internet giants such as Google and Yahoo Inc. She managed tactical roll-outs of key products and platforms for both companies. In her PM role she proved her ability to architect streamlined operational processes, fostering leadership and accountability, to consistently exceed performance goals and expectations.
As a freelance Project Management Professional, she strives to infuse her Silicon Valley corporate tech expertise into the European Non Profit sector. In this respect she’s already had fruitful collaborations with several notable organizations:
Open Knowledge Foundation (she spearheaded OKFN's work on the LOD2 initiative; she was instrumental in bringing further technical developments to Publicdata.eu, a pan-European data portal, gradually turning it into a more linked-open-data initiative by closely collaborating with the other LOD2 consortium partners)
Sponge Media Lab (She is one of the main Business Strategists, designing the road map and future initiatives of the Lab).
Mozilla Foundation (she project-managed the Bucharest Open Media Challenge – a hack event funded by the Mozilla Foundation under the Knight-Mozilla Open News program and initiated by the Sponge Media Innovation Lab for Eastern Europe)

Sponge, a collaborative media innovation lab for Eastern Europe

Problem & solution: a year ago I initiated Sponge, a media innovation lab for Eastern Europe located in Bucharest and online. It was meant to become the missing R&D lab for journalism, but it turned into a collaborative experimentation lab bringing together independent groups and communities of investigative journalists, coders, designers, activists and legal experts, and involving local universities and students. We are building a hub for innovation occurring at the nexus of transparency and accountability, technology and media. We are creating an ecosystem that fosters relevant and verifiable information and open data.
Case study: with a USD 3,000 grant from the Mozilla Foundation we organized the Open Media Challenge in Bucharest last October, a three-month-long contest on open data and collaboration. Using an open process, we received 20 proposals from Belarus, Russia, Ukraine, Kazakhstan, Moldova and Romania. Of these, 10 were developed during a weekend hackathon and 8 delivered a working app or tool (see http://thesponge.eu/Entry_list/). As one member of the jury said, “the Open Media Challenge is a posterbook example how to do a hackathon: good organization, highly motivated participants and awesome ideas/projects that actually delivered”.
Strategy: based on an experimental, trial-and-error approach, we are turning Sponge into a replicable model for smaller or larger interdisciplinary communities working on open data. Our test bed is the post-communist space, with a shared background and many real-world problems that can be solved collaboratively, also using technology. Our strong points are deep IT and activism knowledge, as well as expertise in journalism and in opening up data and information; our test bed is also valuable because of its many differences in language, technical and media literacy, religion and types of government.

 


Science and Technology Facilities Council (STFC), BioMedBridges Security Policy Adviser

Astrid Woollard was born in Vienna (Austria), where she studied for a Master’s in Microbiology and Genetics. During her undergraduate degree, Astrid worked for a diverse range of pharmaceutical and biotechnological companies and gained her first hands-on experience in regulatory affairs and quality control. In 2008, Astrid moved to the UK to read for a DPhil in Plant Sciences at St. Catherine’s College at the University of Oxford while managing her own jewellery business. In 2012, Astrid took up the position of Security Policy Adviser for the BioMedBridges project at the Science and Technology Facilities Council and has since focused on intellectual property and ethical issues arising from data sharing and re-use within this EU project. Astrid’s unique background allows her to address questions that require combining research interests and legal obligations.

Further information:

More information on BioMedBridges
STFC website
Astrid on LinkedIn

Fit for purpose? Big Data and Intellectual Property Rights.

BioMedBridges is an EU FP7 project that connects data across the ESFRI life sciences infrastructures. The right Intellectual Property (IP) policy can give Europe a competitive advantage in exploiting its life sciences research for applications including drug discovery for personalised medicine, agricultural applications of genomic research, and production of chemicals with engineered enzymes. However, finding the right IP policy is challenging, as current legislation does not provide all the answers to the growing needs of Big Data. BioMedBridges is used as a business model example for impact-creation-oriented IP policy development that satisfies stakeholder and data value chain demands.


TenForce, Semantic Technology Business Unit Manager

Bastiaan Deblieck is the Business Unit Manager of TenForce’s Semantic Technology unit. This unit delivers advanced content and metadata management solutions. Educated as a linguist and with more than 20 years of international experience in the IT industry, Bastiaan is the ideal person to present business-relevant aspects of content & data processing. Bastiaan is a co-founder of TenForce; in previous positions he was a business analyst, project manager and business development manager. He is able to translate a high-level vision and innovative technology into a pragmatic approach that guarantees results. He has been delivering services to and working on projects for the European Commission, Wolters Kluwer, Belgacom, Group 4 Securicor, AXA, and many more.

Further Information:

LinkedIn Information

Who remembers EDP?

TenForce has been successfully implementing software solutions that form the backbone of content and data processing for over a decade now. TenForce projects range from the "old SQL and XML ways" to the current RDF era, and the company is also preparing for the future as a member of the FP7-funded LOD2 project.

 

As a result, TenForce has developed methodologies, tools and strategies to efficiently process multi-format, complex and unstructured data, covering data capture, cleansing, storage, search, sharing, analysis and visualization. TenForce will share insights from actual projects for government, multimedia publishing, B2B publishing and banking. So learn "how an SME like TenForce sees the content and data processing world" and find out the meaning of EDP.


Co-chair W3C Government Linked Data WG

Ms. Hyland is CEO of 3 Round Stones, a Linked Data company based in Washington, D.C., and co-chair of the W3C Government Linked Data Working Group. Bernadette is an advocate of open data and Web standards for better knowledge sharing. She has founded several successful startup companies and actively supported Open Source projects related to RDF and Linked Data, including the Mulgara Semantic Store, Persistent URLs and the Callimachus Project. She is an editor of the W3C Best Practices for Government Linked Data and the author of peer-reviewed chapters on Linked Data.
Accomplishments
Linking Government Data (chapter author, Springer, 2011)
Linked Enterprise Data (chapter author, Springer, 2010)

Delivering on Standards for Government Linked Data - A W3C Working Group Report

Three years ago, several pioneering members of the data.gov.uk data transparency effort looked to the Web’s standards organization, the W3C, to help charter a working group to provide standards and other information to help governments around the world publish their data as effective and usable Linked Data. The Government Linked Data working group, a part of the eGovernment Activity and closely connected with the Semantic Web Activity, has collected and is making available information about government Linked Data activities around the world. It has used that information and the experience of its participants to develop W3C Recommendations for Best Practices and for the RDF Vocabularies necessary for publishing government data as Linked Data.
The W3C Government Linked Data working group comprises over 60 participants, of whom more than 20 are active members who meet weekly and are predominantly from Europe. A third of the members are from commercial firms, a third from research/academic institutions and the balance from federal and central government and non-profit organizations (see: http://dir.w3.org/rdf/2012/directory/statistics.xhtml?view). Collectively, the members are involved in government open data projects publishing content of global relevance.
This presentation will highlight the progress of this working group as the charter draws to a close in May 2013. We’ll describe:

  1. The progress of the W3C Government Linked Data working group to date;
  2. The Community Directory, Best Practices and Standard Vocabularies; and
  3. First steps that government authorities can take to start publishing 4- and 5-star linked data to the Web of Data (a brief illustrative sketch follows this list).
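
As a purely hypothetical illustration of the kind of first step mentioned in point 3, the sketch below describes a government dataset with the DCAT vocabulary (one of the W3C vocabularies in this space) using Python and rdflib; the dataset, publisher, licence and all URIs are invented for illustration and are not taken from the working group's deliverables.

```python
# Hypothetical sketch: describing a government dataset with DCAT using rdflib.
# All URIs, titles and publishers below are invented for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

DCAT = Namespace("http://www.w3.org/ns/dcat#")
DCT = Namespace("http://purl.org/dc/terms/")

g = Graph()
dataset = URIRef("http://data.example.gov/dataset/road-accidents-2012")

g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCT.title, Literal("Road accidents 2012", lang="en")))
g.add((dataset, DCT.publisher, URIRef("http://data.example.gov/org/transport-ministry")))

# A distribution pointing at a machine-readable, openly licensed download is
# what moves a dataset up the 4- and 5-star ladder.
dist = URIRef("http://data.example.gov/dataset/road-accidents-2012/rdf")
g.add((dataset, DCAT.distribution, dist))
g.add((dist, RDF.type, DCAT.Distribution))
g.add((dist, DCAT.downloadURL, URIRef("http://data.example.gov/files/road-accidents-2012.ttl")))
g.add((dist, DCT.format, Literal("text/turtle")))
g.add((dist, DCT.license, URIRef("http://creativecommons.org/licenses/by/4.0/")))

print(g.serialize(format="turtle"))
```

Publishing such catalogue descriptions at stable URIs, alongside the data itself in RDF, is one way an authority could begin linking its holdings into the Web of Data.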

Bryan Drexler

Vice President EMEA, Jaspersoft

As Vice President, EMEA, Bryan Drexler manages Jaspersoft’s sales and technical operations throughout EMEA. Prior to holding this position, Bryan Drexler was responsible for delivering and maintaining Jaspersoft consulting services through its network of partners. Mr. Drexler also played a key role in Jaspersoft establishing its European headquarters in Dublin.
Prior to joining Jaspersoft, Bryan Drexler held practice and project management roles at Prolifics, an IBM Premier Partner, with responsibility for all US federal government and mid-Atlantic commercial accounts. While at Prolifics, he also architected and directed the development of the Quicktime time and attendance application, which directs payment for over 250,000 US government employees throughout the world. Previously, Bryan was the Manager of Software Development for Penn, Schoen, and Berland Associates, supporting President Bill Clinton, Vice-President Al Gore, and Fortune 500 clients throughout the United States. Bryan is a certified Project Management Professional, holds an MBA from Virginia Tech, and a Bachelor of Science in Computer Science from The American University in Washington, DC.

Does the 80/20 rule also apply to Big Data?

The 80/20 rule says that 80% of revenue comes from 20% of a company’s customer base. The development of a reporting or performance measurement system consists of two parts: first, data is prepared; second, the information is transformed into a format that the user can understand. This principle also applies when using Big Data technologies. The session begins with an analysis of the current Big Data landscape, describing what types of Big Data engines exist and how they can potentially be used with BI platforms. You will see how Jaspersoft makes various Big Data sources accessible to different types of business users.


Open Knowledge Foundation, Community Manager, Open Government Data

Christian is the Open Knowledge Foundation community manager for Open Government Data – and is also part of the community coordination team for the global network of Local Groups. He lives in Copenhagen, Denmark, and has a background in media and culture entrepreneurship, co-working facilitation, community creation, hacktivism – and is a dedicated open-everything advocate.

A one-stop shop for Open Government Data: publicdata.eu improved with social features and more linked data sets

publicdata.eu aggregates an increasing number of data catalogues from throughout the European Union. As part of the LOD2 project, the Open Knowledge Foundation is working with representatives of local, regional and national open data catalogues to have their datasets represented in this single portal. This presentation will showcase many recent improvements to publicdata.eu, such as personalisation features that allow you to ‘follow’ particular datasets, groups and users. The number of linked data sets has also been significantly increased, by adding the ability to convert CSV datasets to RDF and allowing users to specify the mappings to the right ontologies.
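
To make the CSV-to-RDF idea concrete, here is a minimal sketch (not the actual publicdata.eu code) of mapping user-chosen CSV columns to vocabulary properties with Python and rdflib; the column names, base URI and vocabulary choices are invented for illustration.

```python
# Minimal sketch of CSV-to-RDF conversion with user-supplied column-to-property
# mappings, in the spirit of the publicdata.eu feature described above.
# Column names, base URIs and vocabulary choices are illustrative only.
import csv
import io

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/dataset/")        # hypothetical base namespace
DCAT = Namespace("http://www.w3.org/ns/dcat#")
DCT = Namespace("http://purl.org/dc/terms/")

# User-specified mapping from CSV columns to ontology properties.
column_mappings = {
    "title": DCT.title,
    "description": DCT.description,
}

csv_data = io.StringIO(
    "id,title,description\n"
    "1,Street lights,Locations of public street lights\n"
    "2,Playgrounds,Public playgrounds in the city\n"
)

g = Graph()
for row in csv.DictReader(csv_data):
    subject = EX[row["id"]]                           # one resource per CSV row
    g.add((subject, RDF.type, DCAT.Dataset))          # illustrative typing
    for column, prop in column_mappings.items():
        g.add((subject, prop, Literal(row[column])))

print(g.serialize(format="turtle"))
```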


CNGL at Trinity College Dublin, Assistant Research Professor

Dr. David Lewis has 22 years of R&D experience in both academia and industry. He has over 120 peer-reviewed publications in the areas of integrated service and knowledge management and service interoperability across several application domains. He currently leads research on the interoperability of language technology, content management and localisation systems for the Centre for Next Generation Localisation. He is co-chair of the Multilingual Web – Language Technology (MLW-LT) working group at the W3C, which is developing ITS 2.0, a set of standard content tags for the language services industry. He is co-chair of recent workshops on the Multilingual Semantic Web, the MultilingualWeb-Linked Open Data W3C workshop, and the FEISGILTT interoperability workshop – a neutral forum for standards harmonisation co-located with Localization World.

Linked Data Reuse in the Language Services Industry

The outsourced language services market is a large international industry with established commercial models for data reuse focused on sentence and term translations. This paper presents emerging metadata standards and linked data vocabularies that can now be combined to provide process and provenance annotation of such language data as linked data. These solutions serve both to reduce the currently high interoperability overhead costs experienced across translation value chains and to offer new opportunities for commercialising more fine-grained reuse of language data in data-driven language technologies such as machine translation and named entity recognition.
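
To make the idea of process and provenance annotation of language data concrete, here is a minimal, hypothetical sketch using the W3C PROV-O vocabulary with rdflib; the segment URIs, the translation activity and the agent are invented, and the actual vocabularies discussed in the talk may differ.

```python
# Hypothetical sketch: annotating a translated segment with provenance
# information using the W3C PROV-O vocabulary and rdflib.
# URIs, literals and the choice of properties are illustrative only.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/l10n/")

g = Graph()

source = EX["segment/42/en"]          # source-language segment
target = EX["segment/42/de"]          # translated segment
activity = EX["job/2013-001"]         # the translation job
translator = EX["agent/mt-engine-1"]  # e.g. an MT engine or a human translator

g.add((source, RDF.type, PROV.Entity))
g.add((source, PROV.value, Literal("The pump must be switched off.", lang="en")))

g.add((target, RDF.type, PROV.Entity))
g.add((target, PROV.value, Literal("Die Pumpe muss ausgeschaltet werden.", lang="de")))

# Process and provenance: which activity produced the translation,
# from which source, and who (or what) carried it out.
g.add((target, PROV.wasDerivedFrom, source))
g.add((target, PROV.wasGeneratedBy, activity))
g.add((activity, RDF.type, PROV.Activity))
g.add((activity, PROV.used, source))
g.add((activity, PROV.wasAssociatedWith, translator))
g.add((translator, RDF.type, PROV.Agent))

print(g.serialize(format="turtle"))
```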


Fingal County Council, Assistant Head of Information Technology

Dominic Byrne is Assistant Head of Information Technology with Fingal County Council and has 20 years of experience in IT.  He is responsible for managing the provision of IT services in Fingal and his current interests include Open Data, Open Government, Social Media, and Knowledge Management.  He is responsible for Fingal Open Data which was the first Open Government Data website in Ireland.  He is also a member of the Dublinked management team and the National Open Data Working Group.  In addition, he is a member of the Digital Dublin Forum which is currently working on a Digital Strategy for the Dublin Region.

Further Information:

http://data.fingal.ie
http://twitter.com/fingalopendata
http://www.slideshare.net/fingalopendata
http://www.dublinked.ie
http://twitter.com/dublinked

Irish Open Data Reuse Exemplars

The release of Irish Open Government Data has been led by the Local Authorities in Dublin, starting with Fingal County Council’s Fingal Open Data initiative and followed by the Dublin Region’s Dublinked project. This presentation will review the reuse of this data to date with particular focus on startup companies and outline the data-driven innovation network approach of the Dublinked project.


Office for Personal Data Protection, Director of Analytics and Public Sector Data Processing Department

František Nonnemann, Office for Personal Data Protection of the Czech Republic, Director of Analytics and Public Sector Data Processing Department.

Born in 1981, he has been working for the Office since 2006. Among other topics, he deals with the legal relationship between freedom of information, the re-use of public sector information, and personal data protection. He has published several professional articles in Czech legal journals and given several talks on this topic, mainly for public servants in the Czech Republic and, abroad, as a TAIEX expert in the Republic of Macedonia in October 2012.

Further Information:

http://www.uoou.cz

Re-use of public sector information and personal data protection

Among the information collected, stored, disseminated and otherwise processed by the public sector that can also be re-used within the meaning of the PSI Directive, there is plenty of data that can be regarded as personal in the sense of the Data Protection Directive. The data protection principles may sometimes constitute barriers to the re-use of this type of information. My presentation will deal with the relationship between the re-use of PSI and data protection at the EU level, and with the Czech legislation, which establishes a specific, and hopefully inspirational, legal regime for the processing of lawfully published personal data, covering the legal basis for their processing as well as the information and notification obligations of the processor of such data.


Ghislain Auguste Atemezing and Raphaël Troncy

EURECOM, Campus SophiaTech, France

Ghislain Auguste Atemezing is currently a PhD candidate at Telecom ParisTech, under the supervision of Raphaël Troncy, based at Eurecom. (http://www.eurecom.fr/~atemezin/#bio)

Towards Interoperable Visualization Applications Over Linked Data

This talk provides an overview of the types of applications built on Linked Data, a framework for assessing the tools used to build those applications, and a vocabulary for describing visual applications developed on top of LOD for greater interoperability and component reusability. Our proposal derives from a study of applications in different domains built in Open Data initiatives in the UK, the US and France. We identify two benefits of our approach: (i) it could help build a sustainable ecosystem of visual applications making use of 4- and 5-star datasets, and (ii) it could make it easier to find application patterns used for creating Linked Data applications.


PwC Technology Consulting, Senior Manager

João R. Frade is a Senior Manager with the PwC Technology Consulting practice, focusing on the delivery of consultancy services to the European Institutions. João has assisted numerous organisations, in several sectors, as a business and technology enterprise architect. He has extensive experience in managing complex assignments and in evaluating, selecting and designing ICT-based solutions that enable organisations, from the private or public sector, to deliver their value proposition. João has been involved in the Semantic Interoperability action of the ISA Programme of the European Commission since 2007.


Business models for Linked Government Data?

Linked data provides a new perspective on how data from open and closed sources can be integrated. Although it is often used in the context of open data, not all linked data is open, and of course not all open data has to be linked.

Different business models for linked data have been proposed so far, including subscription-based access, loss-leader models, data markets, and business models based on data mash-ups, data analytics, product and service recommendation, and geospatial data representation.    

Reporting on a study currently being conducted by PwC for Action 1.1 on semantic interoperability of the ISA Programme of the European Commission, this presentation will discuss drivers, requirements and challenges for possible business models for linked (open) data, focusing on government data. We will talk about the cost drivers and pricing of linked data and its benefits compared with un-linked data. We will pinpoint potential end-users and pitch the role of the developer community, also featuring selected case studies from around the world.


Head of Legislation Services at The UK National Archives

John Sheridan is Head of Legislation Services at The [UK] National Archives, where he leads the team responsible for legislation.gov.uk, the UK's official legislation website and the world's first linked data statute book. As former Head of e-Services and Strategy at the Office of Public Sector Information, he was one of the leading pioneers of open and linked data in the UK. In his current role, John devised "expert participation", an approach for maintaining the UK Government's official legislation database as open data, working with the private sector. He is an active contributor to initiatives at a European level to develop linked legislation data. John is a former co-chair of the World Wide Web Consortium's e-Government Interest Group.

Good Law from Open Data

In the context of the recent European Legislation Identifier initiative and the new EU e-Law Linked Open Data taskforce, this presentation will survey the opportunities that Linked Open Data offers for legislative and legal compliance data. What does it mean to talk about legislation as data? How can open data work as an operating model for the public sector to enable this? Where and how do Linked Data standards fit? What are the commercial opportunities for the use and exploitation of law as data? This presentation will showcase recent projects and explore the Europe-wide possibilities and benefits of Linked Open Legislation Data.


DAMA-UPC, Universitat Politècnica de Catalunya, BarcelonaTech, Director

Josep-L. Larriba has been the director of the DAMA-UPC research group since its creation in 1999. His main areas of interest are graph database technology and applications, benchmarking and data management in general. He is one of the founders of Sparsity Technologies, a spin-out of UPC that commercializes and creates technologies around graph databases. He has been involved in different national and international projects with companies like IBM and CA Technologies, and he is now coordinating the FP7-ICT project LDBC.

The Linked Data Benchmark Council, benchmarking RDF and Graph technologies

In this brief talk we will present the EC-funded project “Linked Data Benchmark Council” (LDBC, www.ldbc.eu). The goals of the project are to create the first comprehensive suite of open, fair and vendor-neutral benchmarks for RDF/graph databases, together with the LDBC foundation, which will define processes for obtaining, auditing and publishing results. The project will carry out a significant amount of work on the methodology of benchmark creation and on the definition and evolution of the workloads, which includes use case selection, the creation of queries and the generation of data.


Keiran Millard

Group Manager, SeaZone

Keiran is the manager of the SeaZone group of HR Wallingford Ltd. SeaZone is an SME that specialises in the supply of marine mapping and data products to support the commercial and responsible exploitation of the marine environment. SeaZone Hydrospatial is the most widely used vector marine mapping product for marine spatial planning. Keiran has been awarded an MPhil degree in measurement information systems, a BEng in control and instrumentation systems and an MSc in marine and coastal management. Keiran has 20 years of national and international experience and his key expertise is in information policy and technology for marine management. At a European level he is leading the development of the INSPIRE data specifications for ocean and marine data. Keiran has been on the steering committee of twelve European Commission projects advancing information provision for coastal and marine management, including three as coordinator. He is currently leading the exploitation of the IQmulus project.

Frankenstein’s Data

Big Data meets Open Data.  Creating benefit for the economy or a monster that we will not be able to control?

This is a question being asked within the IQmulus project in the context of how we exploit, on a sustainable basis, the products and services we are researching. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad-hoc analysis, yet with the right infrastructure in place they could yield a myriad of disparate, and readily usable, information products. IQmulus is setting itself out to be this infrastructure, but does Open Data policy allow for its sustainable operation?

"Data is the raw material of the digital age" it is often cited and "open data" has been adopted as the way to drive this digital economy forward.  The rationale for this is simple: if (public) data is thrown to the wider (private and academic) community on a perpetual, no fee basis then other communities will arise that make use of the data in interesting and important ways.  Those that are useful will thrive, and those of no value will wither and die.  As stated, the primary rationale for this is data created by public bodies.  Everyone wins - the public sector does the minimum, private sector adds value and earns an income from selling these products (driving economic growth) and citizens are empowered with services that go well beyond the original data.  However experience to date has highlighted some cautionary tales that indicate that this vision isn’t working in practice and it is forcing how private data services like IQmulus (at a later stage) could operate on a financially sustainable basis.


The Norwegian Meteorological Institute, IT law professional

Lyng specialises in ICT law and is responsible for the data policy at The Norwegian Meteorological Institute, which includes the licensing of meteorological data and products, the licensing of software, and scientific work. Her background also includes extensive experience working with ICT contracts and negotiating national and international procurements and multilateral co-operations.

She holds several positions in national and international advisory boards and working groups which aim at improving the exchange of scientific and meteorological information. Lyng has furthermore contributed at the national level to increasing knowledge of open data and licensing issues through her numerous presentations at seminars and conferences. She has also contributed as a member of the open data guidelines secretariat, appointed by The Norwegian Ministry of Government Administration, Reform and Church Affairs. The secretariat has facilitated a process which has led to a set of guidelines for open data in Norway.

Openness as a public strategy: Make open, make available

When The Norwegian Meteorological Institute, together with The Norwegian Broadcasting Corporation, launched the new weather service yr.no in 2007, it quickly became one of the most popular websites in Norway. Five years later, statistics show that the majority of the Norwegian population are frequent users of yr.no, and the service has even motivated many senior citizens to use the Internet for the first time.
Besides covering end users' need for weather information, the making of yr.no marked a revolution: the institute decided to stop selling the weather information produced by the core service. Hence, all data and products that you can see on yr.no as numbers, figures and animations are available as open data. The consequence was that we gave up a marginal income in favour of society at large.
In the context of empowering enterprises oriented towards new and innovative technologies, we believe a new attitude of openness within the public sector is one important source we have to explore and exploit. In order to do so, there needs to be a change of attitude in governmental institutions and agencies towards practicing openness as more than obeying the legal duty of transparency. The new openness is about making the results of our work available to the public, the ones who fund our service. Our work should be available for re-use with no restrictions. We believe that giving re-users free access to data has huge potential to become an important contribution to fostering innovation and value creation in the business sector.
At The Norwegian Meteorological Institute we are practicing this new openness in many ways: our data and products are available as machine-readable data (raw data). The institute provides downloading facilities as an integrated part of the yr.no systems. Our systems are designed to take care of both our core services and the re-users, a design that had the re-users in mind from the start. Furthermore, software programs that are developed in-house are made available under open licenses; the software can be freely used by anyone for their own purposes. Our research department has a significant production of scientific publications. These are published under “open access” licenses and are available to anyone, whether they are researchers, scholars or entrepreneurs looking for new business opportunities.
Our policies are in accordance with the spirit of the PSI Directive and the goals in the ICT policy of the Norwegian Government. One of the priority areas in the ICT policy is “contributing to innovation and value creation in the business sector by arranging for development and use of services based on a digital content, making public data accessible for further use and distribution […]”.
My goal in this lecture is to inspire the participants of the European Data Forum with our unique experience as a case study, covering both the legal environment and our dialogue with the re-use community. In this way we hope to encourage others to embrace this new way of thinking, giving some eye-opening examples from our extensive experience. I will try to show that this can foster innovation in more than one way – and in ways that we could not have foreseen some years ago: for instance, in windmill projects, smart house technology and the construction of ship hulls.
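
As an illustration of the machine-readable raw data mentioned above, the sketch below fetches a forecast from MET Norway's public API. The exact endpoint, API version and response layout are assumptions on my part and may have changed; the service asks clients to identify themselves via the User-Agent header.

```python
# Illustrative sketch only: fetching machine-readable forecast data from
# MET Norway's public API. The endpoint, API version and JSON layout are
# assumptions based on the documented locationforecast service and may differ.
import json
import urllib.request

URL = ("https://api.met.no/weatherapi/locationforecast/2.0/compact"
       "?lat=59.91&lon=10.75")  # Oslo

# The terms of service ask clients to identify themselves.
request = urllib.request.Request(
    URL, headers={"User-Agent": "edf-example/0.1 you@example.org"}
)

with urllib.request.urlopen(request) as response:
    forecast = json.load(response)

# Print the air temperature of the first forecast time step (layout assumed).
first = forecast["properties"]["timeseries"][0]
print(first["time"], first["data"]["instant"]["details"]["air_temperature"], "°C")
```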


European Commission - Joint Research Centre, Scientific/Technical Project Officer

Michael Lutz is a scientific officer in the "Digital Earth and Reference Data" unit at the Joint Research Centre of the European Commission. Michael's research interests include the semantic modeling of geospatial data and processes to support the discovery, access, and re-use of geographic information. As the technical and data modelling contact point for the INSPIRE [1] data specifications, he has been supporting the work of 19 expert groups to ensure cross-thematic harmonisation of the INSPIRE data models and cross-cutting architectural consistency with other aspects of INSPIRE (e.g.  metadata, services and registries). He has been applying INSPIRE principles to the wider domains of e-Government and the Semantic Web as co-chair of the Core Location Vocabulary Working Group [2] and the W3C Location and Addresses Community Group [3] and in the context of the EU Interoperability Solutions for European Public Administrations (ISA) Programme [4]. Michael has a PhD in geoinformatics from the University of Münster, Germany.

[1] http://inspire.jrc.ec.europa.eu/
[2] http://joinup.ec.europa.eu/asset/core_location/description
[3] http://www.w3.org/community/locadd/
[4] http://ec.europa.eu/isa/

Data interoperability across sectors and borders - INSPIRE and beyond

For the last 10 years, INSPIRE (Infrastructure for Spatial Information in Europe) has been developing a legal framework and interoperability guidelines to facilitate pan-European and cross-border access to geospatial data from diverse domains (e.g. addresses, transport networks, geology and natural hazards). One of the key success factors for INSPIRE has been the active participation of several hundred data practitioners and users from across the EU Member States. As public administrations across Europe start implementing INSPIRE, there is a growing interest in combining INSPIRE data with other information from the public and private sectors to create innovative products and services, adding to data value chains. The presentation will highlight INSPIRE solutions relevant for other sectors and for cross-sector interoperability.


Principal Advisor with PwC Technology Consulting

Nikos Loutas is a Principal Advisor with the PwC Technology Consulting practice, involved mainly in Linked Open Data and Semantic Interoperability projects for the European Commission. Since 2006, Nikos has been working on Semantic Web and e-Government research at DERI, NUI Galway and the Centre for Research and Technology Hellas (CERTH). His interest in semantic technologies and e-Government brought him closer to the rapidly evolving field of Linked Open Government Data (LOGD). Nikos is now working towards industrialising LOGD through the development of shared vocabularies and services that facilitate data publication and reuse, thus revealing the untapped social and economic potential of LOGD.
 


Business models for Linked Government Data?

Linked data provides a new perspective on how data from open and closed sources can be integrated. Although it is often used in the context of open data, not all linked data is open, and of course not all open data has to be linked.

Different business models for linked data have been proposed so far, including subscription-based access, loss-leader models, data markets, and business models based on data mash-ups, data analytics, product and service recommendation, and geospatial data representation.    

Reporting on a study currently being conducted by PwC for Action 1.1 on semantic interoperability of the ISA Programme of the European Commission, this presentation will discuss drivers, requirements and challenges for possible business models for linked (open) data, focusing on government data. We will talk about the cost drivers and pricing of linked data and its benefits compared with un-linked data. We will pinpoint potential end-users and pitch the role of the developer community, also featuring selected case studies from around the world.

 


Peter Haase and Arild Waaler

Dr. Peter Haase works as a lead architect at fluid Operations, where he leads the research and development activities at the interface of semantic technologies and cloud computing. Previously, Peter was at the Institute of Applied Informatics and Formal Description Methods (AIFB) at the University of Karlsruhe, where he obtained his PhD in 2006. Before joining the AIFB, he worked in the Silicon Valley Labs of IBM on the development of DB2 until 2003. His research interests include ontology management and evolution, decentralized information systems and the Semantic Web. At the AIFB, he previously worked in the EU IST projects SWAP (Semantic Web and Peer-to-Peer) and SEKT (Semantically Enabled Knowledge Technologies) and was project leader for the EU IST project NeOn (Lifecycle Support for Networked Ontologies).
 
Prof. Arild Waaler is professor of computer science at the University of Oslo. He has led 5 projects at the national level in Norway and was an initiator of the EUR 12M joint industry initiative Integrated Operations in the High North and the EUR 7M Semicolon project; the former addresses integration challenges in the oil and gas industry, the latter interoperability challenges in the Norwegian public sector. He chaired the 2010 Semantic Days conference, a meeting point for academia, industry and the public sector, and is now co-chair of its Board. Currently, he serves as project coordinator for the Integrating Project Optique.

Scalable End-user Access to Big Data

Scalable end-user access to Big Data is critical for effective data analysis and value creation: Engineers in industry spend a significant amount of their time searching for data that they require for their core tasks. In the Oil&Gas industry, for instance, 30–70% of engineers’ time is spent looking for and assessing the quality of data.
Optique is a large-scale European project focusing on the comprehensive and timely end-user access to very large data sets. In this presentation we describe how the concepts and technologies developed in Optique bring about a paradigm shift for data access by

  • providing a semantic end-to-end connection between users and data sources,
  • enabling users to rapidly formulate intuitive queries using familiar vocabularies and conceptualizations,
  • seamlessly accessing data spread across multiple distributed data sources, and thus
  • reducing the turnaround time for information requests from days to minutes.

In the talk we will discuss the key concepts behind the Optique platform, including

  • the central role of ontologies and declarative mappings to capture user conceptualizations and to transform user queries into highly optimized queries over the data sources,
  • integration of distributed heterogeneous data sources, including streams,
  • the exploitation of massively parallel technologies and holistic optimizations to maximize performance,
  • tools to support query formulation and ontology and mapping management, and
  • semi-automatic bootstrapping of ontologies and mappings and query driven ontology construction to minimize implementation overhead.

We will illustrate the value of the Optique platform through use cases from the case study partners in the energy domain: Siemens and Statoil.
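
As a rough, simplified illustration of the ontology-based data access idea behind Optique (not the Optique platform itself, which rewrites queries against the sources rather than materialising triples), the sketch below maps rows of a relational table to ontology terms and then answers a SPARQL query phrased in the user's vocabulary; the table, ontology and mapping are invented for illustration.

```python
# Simplified, hypothetical illustration of ontology-based data access:
# a declarative mapping lifts rows from a relational source into an
# ontology vocabulary, and users query in that vocabulary with SPARQL.
# (Optique itself rewrites queries against the sources instead of
# materialising triples; this sketch only conveys the general idea.)
import sqlite3

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/ontology/")

# A toy relational source standing in for an engineering database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE wellbore (id INTEGER, name TEXT, depth_m REAL)")
db.executemany("INSERT INTO wellbore VALUES (?, ?, ?)",
               [(1, "W-101", 2450.0), (2, "W-102", 3120.5)])

# Declarative mapping: every row becomes an ex:Wellbore with two properties.
g = Graph()
for row_id, name, depth in db.execute("SELECT id, name, depth_m FROM wellbore"):
    s = URIRef(EX[f"wellbore/{row_id}"])
    g.add((s, RDF.type, EX.Wellbore))
    g.add((s, EX.name, Literal(name)))
    g.add((s, EX.depthInMetres, Literal(depth)))

# The end user asks a question in the ontology vocabulary, not in SQL.
query = """
PREFIX ex: <http://example.org/ontology/>
SELECT ?name ?depth WHERE {
  ?w a ex:Wellbore ; ex:name ?name ; ex:depthInMetres ?depth .
  FILTER (?depth > 3000)
}
"""
for name, depth in g.query(query):
    print(name, depth)
```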


Couchbase Inc, Technical Advocate

Over the last few years Robin Johnson has been involved in some of the leading social gaming and betting applications on the web. Having bootstrapped an award-winning social gaming startup, Robin is a seasoned distributed systems and NoSQL advocate. A polyglot Rubyist and Pythonista all in one, he has taken on roles from frontend developer to software engineer and data architect.

Big Data Ad-Targeting applications with NoSQL and Hadoop

We ask much more of our data than ever before. Not only does it need to be available in milliseconds when requested, but we also want to be able to extract every last bit of intelligence from the data to make decisions that make our applications more effective.
During this presentation you will see how to mix Hadoop and a NoSQL engine, such as Couchbase, to create an ad-targeting platform that analyses billions of user-related events and millions of user profiles, and delivers the correct ad in a handful of milliseconds. Based on this concrete example you will see how these technologies provide you with all of the tools to deal with new types of applications and to scale up (in terms of data or users). This tutorial is beneficial for any attendee, as they will learn from a production large-scale use case (a small illustrative sketch follows the list below):

  • The challenges of new applications dealing with Big Data (especially how to deliver them quickly)
  • The differences between the “Analytical” part of the system (Hadoop) and the “Operational” part (Couchbase)
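
The sketch below illustrates the general split described above: an offline (Hadoop-style) batch job precomputes segment-to-ad mappings, and the low-latency serving path only performs key-value lookups. A plain Python dict stands in for the Couchbase bucket, so no actual SDK calls are shown, and all names and data are invented.

```python
# Hypothetical sketch of the analytical/operational split described above.
# A plain dict stands in for the Couchbase bucket (key-value store); the
# offline step would in practice be a Hadoop job over billions of events.
import time

# --- "Analytical" side: batch output, e.g. produced nightly by Hadoop ---
# segment -> ranked list of ad ids
segment_ads = {
    "sports": ["ad-17", "ad-42"],
    "travel": ["ad-08"],
}

# --- "Operational" side: key-value store of user profiles --------------
bucket = {
    "user::1001": {"segments": ["travel", "sports"]},
    "user::1002": {"segments": ["sports"]},
}

def choose_ad(user_id: str, default_ad: str = "ad-house") -> str:
    """Look up the user's profile and return the first matching ad."""
    profile = bucket.get(f"user::{user_id}")
    if profile is None:
        return default_ad
    for segment in profile["segments"]:
        ads = segment_ads.get(segment)
        if ads:
            return ads[0]
    return default_ad

start = time.perf_counter()
ad = choose_ad("1001")
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"Serving {ad} (decision took {elapsed_ms:.3f} ms)")
```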

Open Knowledge Foundation, Head of Long Term Projects

Sander van der Waal is Head of Long Term Projects at the Open Knowledge Foundation. Overseeing many of the grant-funded projects the OKF is involved with, Sander ensures that any funded project benefits maximally from cross-fertilization with other related initiatives across the wider international Open Knowledge Foundation community. Sander has a strong background in the open source world, having advised academic projects on open source software development at OSS Watch, based at the University of Oxford. He is a committer and contributor at the Apache Software Foundation and has co-organised several large and small open source events.

A one-stop shop for Open Government Data: publicdata.eu improved with social features and more linked data sets

publicdata.eu aggregates an increasing number of data catalogues from throughout the European Union. As part of the LOD2 project, the Open Knowledge Foundation is working with representatives of local, regional and national open data catalogues to have their datasets represented in this single portal. This presentation will showcase many recent improvements to publicdata.eu, such as personalisation features that allow you to ‘follow’ particular datasets, groups and users. The number of linked data sets has also been significantly increased, by adding the ability to convert CSV datasets to RDF and allowing users to specify the mappings to the right ontologies.


2ndQuadrant, CTO

Simon Riggs is CTO of 2ndQuadrant, a Major Developer and Committer on the PostgreSQL open source database project, and Technical Coordinator of the AXLE project for Big Data. Simon has 25 years of experience as a scientific researcher, systems software developer and solution architect for large database systems.

Practical PostgreSQL

A look at how and where the PostgreSQL advanced open source database is being used in European science and industry, for both big and small data, with a discussion of how things are changing over time. The talk includes a look at the requirements big data imposes for various use cases and how we currently cope with these challenges. We also discuss the AXLE project and its impact and importance in various industries.


Romanian Centre for Investigative Journalism, Co-Founder

Stefan CANDEA, Co-Founder, Romanian Centre for Investigative Journalism. Candea is a member of the International Consortium of Investigative Journalists and has won several awards, including the IRE Tom Renner Award and the Overseas Press Club of America Award for online journalism. He teaches investigative journalism at Bucharest University and was the 2011 Carroll Binder Nieman Fellow at Harvard University.

Stefan is a freelance journalist and co-founder of the Romanian Centre for Investigative Journalism in Bucharest, Romania. He's written about the connections between international organized crime networks and high-ranking politicians, as well as international arms trade and illegal international adoptions.

Stefan worked for the first investigative TV show in Romania, Reporteri Incognito. He's also worked for Deutsche Welle and in print, radio, TV and online. He's done freelance research and production work for several foreign media outlets, including the BBC, Channel 4, ITN, ZDF and Canal Plus. Since March 2001, he has been a correspondent for Reporters sans Frontieres in Romania.

In 2011 Stefan started Sponge (thesponge.eu), a collaborative media innovation lab, and he is currently working on turning a reporting blog about the Black Sea (theblacksea.eu) into an online in-depth magazine for the region.
Further Information
http://www.thesponge.eu
http://www.nieman.harvard.edu/reports/issue/100067/Spring-2011.aspx

Sponge, a collaborative media innovation lab for Eastern Europe

Problem & solution: a year ago I initiated Sponge, a media innovation lab for Eastern Europe located in Bucharest and online. It was meant to become the missing R&D lab for journalism, but it turned into a collaborative experimentation lab bringing together independent groups and communities of investigative journalists, coders, designers, activists and legal experts, and involving local universities and students. We are building a hub for innovation occurring at the nexus of transparency and accountability, technology and media. We are creating an ecosystem that fosters relevant and verifiable information and open data.

Case study: with a USD 3,000 grant from the Mozilla Foundation we organized the Open Media Challenge in Bucharest last October, a three-month-long contest on open data and collaboration. Using an open process, we received 20 proposals from Belarus, Russia, Ukraine, Kazakhstan, Moldova and Romania. Of these, 10 were developed during a weekend hackathon and 8 delivered a working app or tool (see http://thesponge.eu/Entry_list/). As one member of the jury said, “the Open Media Challenge is a posterbook example how to do a hackathon: good organization, highly motivated participants and awesome ideas/projects that actually delivered”.

Strategy: based on an experimental, trial-and-error approach, we are turning Sponge into a replicable model for smaller or larger interdisciplinary communities working on open data. Our test bed is the post-communist space, with a shared background and many real-world problems that can be solved collaboratively, also using technology. Our strong points are deep IT and activism knowledge, as well as expertise in journalism and in opening up data and information; our test bed is also valuable because of its many differences in language, technical and media literacy, religion and types of government.


European Environment Agency

Søren Roug has been employed by EEA for 14 years and has worked on Reportnet from the beginning. He is now the head of the software development group with overall responsibility for integration of the subsystems.

Linked Data case study: Reportnet

The European Environment Agency receives statistics about the environment from the member countries and has built a reporting system called Reportnet to handle the deliveries. The presentation will show how the EEA is integrating XML technologies with Linked Data for quality assessment, merging and use of the data. We use XQuery to report on QA issues. The XQuery scripts make SPARQL queries to check codes against reference data and previous deliveries. Then the data is transformed from XML to RDF with XSLT and imported into a triple store. We use SPARQL to run another type of QA and to infer links to other deliveries. Finally, some visualisations of the data will be shown.
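
A minimal sketch of the kind of code-list check described above, using Python and rdflib in place of the actual Reportnet XQuery/SPARQL tooling; the reference data, the delivered data and the vocabulary URIs are invented for illustration.

```python
# Minimal sketch of a code-list QA check in the spirit of Reportnet:
# verify that codes used in a delivery exist in the reference data.
# (Reportnet drives such checks from XQuery scripts that issue SPARQL
# queries; the data and vocabularies below are invented for illustration.)
from rdflib import Graph

reference = Graph().parse(data="""
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
<http://ref.example.org/pollutant/NOX> a skos:Concept .
<http://ref.example.org/pollutant/SO2> a skos:Concept .
""", format="turtle")

delivery = Graph().parse(data="""
@prefix ex: <http://example.org/delivery#> .
ex:m1 ex:pollutant <http://ref.example.org/pollutant/NOX> .
ex:m2 ex:pollutant <http://ref.example.org/pollutant/XYZ> .
""", format="turtle")

# Codes actually used in the delivery, found via SPARQL.
used = delivery.query("""
PREFIX ex: <http://example.org/delivery#>
SELECT DISTINCT ?code WHERE { ?measurement ex:pollutant ?code }
""")

# Codes defined in the reference data graph.
known = set(reference.subjects())

for (code,) in used:
    if code not in known:
        print("Unknown code:", code)   # -> http://ref.example.org/pollutant/XYZ
```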


Tyler Tate

Cofounder of TwigKit

Tyler Tate is the cofounder of TwigKit, a London-based software company that provides tools for rapidly building search-based applications. He is also coauthor of Designing the Search Experience, a book recently published by Morgan Kaufmann, and has written articles for the likes of A List Apart, Boxes & Arrows, UX Matters, and UX Magazine. In the past Tyler led design at Nutshell CRM, designed an enterprise content management system, ran a small design studio, and taught a university course on web design. He blogs at tylertate.com, tweets as @tylertate, and currently lives in Cambridgeshire, England.

Information Wayfinding: The Anthropology of Big Data

Web pages are dying. Organized into websites with navigation and keyword search in the same way that printed pages were organized into books with a table of contents and an index, web pages and their book-like metaphor are a throwback to the past. In the future, our interactions with information will be less like flipping through a book, and more like exploring a museum, navigating a city, or wandering through nature itself. In other words, we are entering a new era of information wayfinding.

Fortunately, the fields of wayfinding and human-information interaction can teach us a good deal about how people find their way through physical and information environments alike. In this forward-looking talk Tyler Tate, author of Designing the Search Experience and cofounder of Twigkit, begins by synthesizing ideas from each of these fields. He then combines this cross-fertilization of ideas with a set of design principles for creating successful information wayfinding experiences, including:

1. Unified interaction

2. Positional cues

3. Survey views

4. Paths onward

5. A path back to known territory

Most importantly, Tyler then turns the theory into practice by demonstrating hands-on techniques that can be applied to your current project, such as: moving from navigation to faceted navigation, using human-readable URLs, offering flexible controls, providing contextual search, using integrated breadcrumbs effectively, providing topic views, and offering onward-leading detail views.

As web pages increasingly become a thing of the past, users require new means of interacting with an ever-expanding landscape of information. Welcome to the era of information wayfinding.