Benchmarks for evaluating the progress of open data adoption: usage, limitations, and lessons learnt
Örebro University, Örebro University School of Business. (Informatics)
Faculty of Technology, Policy and Management, Delft University of Technology, Delft, The Netherlands.
Faculty of Technology, Policy and Management, Delft University of Technology, Delft, The Netherlands.
Örebro University, Örebro University School of Business. (Informatics) ORCID iD: 0000-0002-3713-346X
2015 (English). In: Social science computer review, ISSN 0894-4393, E-ISSN 1552-8286, Vol. 33, no 5, p. 613-630. Article in journal (Refereed). Published.
Abstract [en]

Public organizations release their data for use by the public as part of efforts to open up government. Various benchmarks for evaluating the progress of open data adoption have emerged recently. To help bring about a better understanding of the common and differentiating elements in open data benchmarks, and to identify the methodologies and metrics that account for their variation, this article compares open data benchmarks and describes lessons learned from their analysis. An interpretive meta-analysis approach was used, and five benchmarks were compared with regard to meta-data (key concepts, themes, and metaphors), meta-methods (the methodologies underlying the benchmarks), and meta-theories (the theoretical assumptions at the foundation of the benchmarks). Each benchmark was found to have its strengths and weaknesses and to be applicable in specific situations. Because the benchmarks differ in scope and focus and use different methodologies, they produce different country rankings. There is a clear gap in both the literature and the benchmarks regarding the evolution of end-user practices and the individual adoption of open data. Finally, lessons are drawn for the development of more comprehensive open data benchmarks and for open government evaluation in general.
Place, publisher, year, edition, pages
2015. Vol. 33, no 5, p. 613-630
Keywords [en]
open data, maturity, adoption, benchmark, index, open government, evaluation, ranking, open data models
National Category
Information Systems
Research subject
Informatics
Identifiers
URN: urn:nbn:se:oru:diva-40657
DOI: 10.1177/0894439314560852
ISI: 000360817500007
Scopus ID: 2-s2.0-84940913647
OAI: oai:DiVA.org:oru-40657
DiVA, id: diva2:778247
Available from: 2015-01-09 Created: 2015-01-09 Last updated: 2023-11-07 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Susha, Iryna; Grönlund, Åke
