Harbor Research has published a white paper, The Future of Open Data in Smart Cities. It does not put the giants of IT in pole position to capitalise.
The title does not really do justice to the contents: the paper argues that the complexities of advanced smart city applications will require new data architectures and drive the success of a new generation of software players. These players, it says, will take advantage of these requirements to reshape information architectures for advanced IoT applications.
Data sharing will be key to the future of smart cities, and we’ve said plenty on our IoT news site, IoTAustralia, about the importance of data sharing in smart city applications, in columns such as: Sharing data – When? Why? How?, Why open data is a top government criterion, and Smart cities and the gold coin.
We’ve also flagged the importance of a data sharing specification by ETSI, in Smart city data: it’s good to share, and ETSI’s helping, and the launch of a data sharing marketplace, in SynchroniCity shares smart city data, smartly.
In a set of predictions, What to expect in IoT in 2019, Chris Penrose, president of AT&T’s IoT organisation, tipped the growing importance of data sharing, saying: “Data sharing and advanced analytics will be key to maximising the value of IoT.”
Smart cities span multiple verticals
Harbor Research argues that smart cities sit at the intersection of many verticals — in infrastructure, buildings and transport — and cover public, personal and private domains.
It suggests smart city applications range from simple ones to complex ones that embrace all relevant city data, require an open ecosystem with collaboration, and are of high value.
These scenarios, it says, are supported by “multiple parallel technology developments … increasingly reinforcing and accelerating one another.” Individually, these technology developments are powerful, but their impact is amplified when they are combined.
Data sharing par excellence
Furthermore, Harbor Research argues that development of these complex applications requires “Fluid interactions between and among distributed data objects and models.”
It envisages emerging architectures that “create abstraction layer and distributed data identity schema that enable a unified approach for accessing and fusing disparate data types in applications that enable micro-services.” (Data sharing par excellence, in other words.)
It says IoT information architectures are moving from ‘application-centric’ data models toward ‘information-centric’ models in which “data value is maximised by enabling disparate data to be combined by diverse applications.”
Much of this could be seen as a reasonable assessment of the smart city ecosystem, today and in the future, but here’s where the white paper might stir the possum.
It claims the “traditional vendors” are trapped in their own silos and do not understand IoT challenges.
“The technical development knowledge that informs incumbent IT, telco and automation/control players’ offerings cumulatively represents the required scope of understanding, but each group individually lacks a complete understanding of future data platform capabilities, integration and deployment requirements.”
These silos are: traditional IT and software systems, traditional telco and carrier systems, and traditional automation and control systems.
A new generation of software players
Stepping into the breach, according to Harbor Research, is a new generation of software players that are “taking advantage of converging computing trends to reshape information architectures for advanced IoT applications.”
In this select cohort Harbor Research names, in order of the significance of their respective innovations: Splunk, RapidMiner, SAS, ThingWorx, SQLstream, SkyFoundry, niolabs, Fathym and Pixeom.
The end result of this innovation could be classified as ‘data sharing Nirvana’: pervasive, distributed computing (where the network is the computer), together with “the existence of an equally pervasive, distributed store of knowledge that finally does away with all theoretical barriers between the world’s units of information.”
If Harbor Research is correct in this vision, it seems very unlikely that the ‘traditional vendor’ behemoths, whose ranks, according to Harbor Research, include Microsoft, IBM, AWS and Google (IT systems and software); Ericsson, Cisco and Verizon (telco); and Siemens, Honeywell and ABB (automation and control), will sit back and let it be realised.
The snapping up of the more successful members of the new generation seems inevitable.